feat: add stream helper #395

Conversation
I know bumping the minimum Node version is usually a breaking change, but as this is a Node version that hasn't been supported by Lambda for a very long time, I don't think it's an actual breaking change. In fact I'd bump it to 14 rather than 12.
```ts
 * // ...
 * return {
 *   statusCode: 200,
 *   body: response.body, // Web stream
```
Not sure I understand the differentiation between a `ReadableStream` and a Web stream, since `response.body` is a `ReadableStream`?
`Readable` is the old Node.js class, and `ReadableStream` is the WHATWG interface that was added in Node v18 under `node:stream/web`. Node.js refers to the streams from the WHATWG Fetch spec as Web Streams.
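For completeness, Node.js also ships converters between the two as static methods on `Readable` (added in Node v17, so newer than the version floor this PR sets); a minimal sketch:

```ts
import { Readable } from 'node:stream'
import type { ReadableStream } from 'node:stream/web'

// A classic Node.js stream...
const nodeStream = Readable.from(['hello', ' ', 'world'])

// ...converted to a WHATWG "Web Stream" (the kind fetch() returns as response.body)
const webStream: ReadableStream = Readable.toWeb(nodeStream)

// ...and back to a Node.js Readable, for APIs that expect the old class
const roundTripped: Readable = Readable.fromWeb(webStream)
```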
Ah, I hadn't realised that was the term they used. I left a suggestion for using WHATWG instead which I think might be slightly clearer, but feel free to take it or leave it.
```ts
 * @see https://ntl.fyi/streaming-func
 */
const stream = (handler: StreamingHandler): Handler =>
  awslambda.streamifyResponse(async (event, responseStream, context) => {
```
What's the plan for making this work in local development? Should we add this to https://github.com/ashiina/lambda-local?
That's probably the best way forward, yes. I think we already got one PR in there.
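For reference, the decorator around `awslambda.streamifyResponse` has roughly this shape (a hypothetical sketch with ambient type stubs, not the PR's actual code, which lives in the diff above):

```ts
import { Readable, pipeline } from 'node:stream'
import { promisify } from 'node:util'

// `node:stream/promises` only landed in Node 15, so promisify the
// classic callback-style pipeline to keep a Node 14 floor.
const pump = promisify(pipeline)

// Ambient stubs for the Lambda-provided global and the types named in
// the diff (assumed shapes, for the sketch only).
type StreamingHandler = (
  event: unknown,
  context: unknown,
) => Promise<{ statusCode: number; body: Readable }>
type Handler = unknown
declare const awslambda: {
  streamifyResponse(
    fn: (event: unknown, responseStream: NodeJS.WritableStream, context: unknown) => Promise<void>,
  ): Handler
}

// Hypothetical internals: run the user's handler, then pipe the returned
// body into the Lambda response stream. (A WHATWG ReadableStream body
// would first need converting to a Node.js Readable, e.g. via
// Readable.fromWeb on Node 17+.)
const stream = (handler: StreamingHandler): Handler =>
  awslambda.streamifyResponse(async (event, responseStream, context) => {
    const { body } = await handler(event, context)
    await pump(body, responseStream)
  })
```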
Co-authored-by: Eduardo Bouças <mail@eduardoboucas.com>
LGTM! Left a couple of non-blocking comments. We should hold off on releasing this until streaming is enabled across the board, though.
````ts
 * @example
 * ```
 * export const handler = stream(async (event, context) => {
 *   const response = await fetch('https://api.openai.com/', { ... })
````
nit: Maybe safer to use one of our own URLs in the example?
```diff
- * const response = await fetch('https://api.openai.com/', { ... })
+ * const response = await fetch('https://api.netlify.com/api/v1/user', { ... })
```
I explicitly chose OpenAI because streaming responses are useful in that context - whereas I'm not aware of any Netlify-owned API where streaming makes a difference. Let's stick with openai for now :)
Co-authored-by: Eduardo Bouças <mail@eduardoboucas.com>
🤖 I have created a release *beep* *boop*

---

## [1.6.0](v1.5.0...v1.6.0) (2023-05-12)

### Features

* add `stream` helper ([#395](#395)) ([7b305cf](7b305cf))

---

This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).

Co-authored-by: token-generator-app[bot] <82042599+token-generator-app[bot]@users.noreply.github.com>
This PR adds a helper to support streaming responses in Netlify Functions. The decorator handles all the `awslambda` things under the hood; all that devs have to do is return a `NodeJS.Readable` or a Web Stream as the `body`.

It also updates the Node.js version to v14, so that `pipeline` is available, which technically makes this a breaking change - but as @ascorbic notes, not an actual one.
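The JSDoc fragments scattered through the diff assemble into roughly this usage (the `@netlify/functions` import path and the fetch options are assumptions):

```ts
import { stream } from '@netlify/functions'

export const handler = stream(async (event, context) => {
  // Proxy a streaming upstream response straight through to the client
  const response = await fetch('https://api.openai.com/', { /* ... */ })

  return {
    statusCode: 200,
    body: response.body, // Web stream
  }
})
```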