[feat] support for streaming via Async Iterator endpoints #2212
Conversation
Force-pushed from c8bd775 to 2fc6b06.
Today I added some test coverage for async iterator support, covering the dev server (basics), adapter-static and adapter-node. In the process I uncovered and fixed some bugs and added adapter-static support. I also rebased. I think the PR is now in good shape; the tests have definitely added confidence that things are working well. Looking forward to any feedback on whether additional changes are needed.
Your PR looks good, but could you check my notes? Try running the following stream server and serving { body: (await fetch('/stream')).body } from SvelteKit. It seems to keep streaming after disconnection, unless I'm messing something up?
// Minimal Node server that streams a random chunk every second until the client disconnects.
import http from 'node:http';
import { Readable } from 'node:stream';

function handle_request(request, response) {
	if (request.url === '/stream') {
		console.log('request opened');
		let interval = 0;
		const stream = new Readable({
			construct(callback) {
				console.log('stream opened');
				const send = () => {
					const chunk = Math.random().toString();
					console.log('stream push', chunk);
					this.push(chunk);
				};
				interval = setInterval(send, 1000);
				callback();
			},
			read() {},
			destroy(error, callback) {
				console.log('stream closed');
				clearInterval(interval);
				callback(error);
			}
		});
		// stop producing chunks once the client goes away
		response.on('close', () => {
			console.log('request closed');
			stream.destroy();
		});
		response.writeHead(200);
		stream.pipe(response);
	} else {
		response.writeHead(404);
		response.end();
	}
}

const server = http.createServer(handle_request);
server.listen(8080);
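For the SvelteKit side of that repro, the endpoint might look roughly like this. This is only a sketch: the route file name is made up, it assumes the endpoint conventions in use at the time of this PR, and it assumes the stream server above is running locally on port 8080.

// Hypothetical endpoint, e.g. src/routes/proxy.js, used only to reproduce the issue.
export async function get() {
	const upstream = await fetch('http://localhost:8080/stream');

	return {
		status: 200,
		// With this PR, a streamed body should keep being written to the client
		// until the upstream stops or the client disconnects.
		body: upstream.body
	};
}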
@jesseskinner anything I can do to help get this PR over the finish line?
@AndreasHald I think it's done, just waiting for a review. @mrkishi had some comments for @kjmph but I'm not sure how critical those bits are for getting this merged. 🤷‍♂️
I'll make the relevant changes, my apologies for the delay. I think @mrkishi is correct: we should handle cancellation of the ReadableStream for Cloudflare support, as well as the closing of the pipe for Node.js support. I'll make the change when free (today or so).
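For illustration, the kind of handling being discussed might look roughly like the sketches below. These are not the PR's actual code; the function names are invented, and the only assumption is that the endpoint's body is an async iterable.

// Node.js: stop pulling from the endpoint's async iterable once the client disconnects.
async function write_body(iterable, res) {
	let closed = false;
	res.on('close', () => {
		closed = true;
	});

	for await (const chunk of iterable) {
		if (closed) break; // breaking the loop also triggers the generator's cleanup via return()
		res.write(chunk);
	}

	res.end();
}

// Cloudflare / web streams: wrap the iterable in a ReadableStream and let
// cancel() stop the underlying iterator when the consumer gives up.
function body_to_stream(iterable) {
	const iterator = iterable[Symbol.asyncIterator]();

	return new ReadableStream({
		async pull(controller) {
			const { value, done } = await iterator.next();
			if (done) controller.close();
			else controller.enqueue(value);
		},
		cancel() {
			return iterator.return?.();
		}
	});
}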
This commit demonstrates the ability to easily add Async Iterator SvelteKit Endpoints. By having an endpoint return an async iterator as the body, SvelteKit will keep emitting writes for as long as the endpoint yields values, or for as long as the client holds the request open, whichever ends first.
I don't have Cloudflare Workers set up, so this still needs testing.
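For illustration, an endpoint using this feature might look roughly like the following. The route name and its content are made up, and the return shape assumes the endpoint conventions SvelteKit used at the time of this PR.

// Hypothetical endpoint, e.g. src/routes/ticker.js.
export function get() {
	async function* ticker() {
		for (let i = 0; i < 10; i++) {
			yield `tick ${i}\n`;
			await new Promise((resolve) => setTimeout(resolve, 1000));
		}
	}

	return {
		status: 200,
		headers: { 'content-type': 'text/plain' },
		// With this PR, SvelteKit keeps writing chunks for as long as this iterator
		// yields values, or until the client closes the request.
		body: ticker()
	};
}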
Force-pushed from 1153900 to 32b2184.
Forgive my whack-a-mole; I see the bar for pull requests has been raised on this project. Good on ya. Anyway, I cleaned up everything I could find. I clearly don't understand how to run the tests on my own machine, yet I was able to fix everything I could find short of the timeout. I'm uncertain whether the timeout is due to these changes. Note that an Either/Or type was added to calm down TypeScript checking; perhaps that causes the tests to take longer?
Oh, would you like the documentation updated in this pull request? Or a new nav item added to demonstrate a streaming endpoint? Now that the preview is auto-deployed (thanks, Vercel), it could be helpful for maintainers.
Thanks @Rich-Harris!
Most of the discussion is occurring over on #1563; I'm opening this pull request to be illustrative. I believe there are a few type errors that will need to be cleaned up before merging.