Should `.drop`'s dropped items be read in parallel? #1
IMHO, it should be done in sequence as a matter of practicality. In my experience with RxJS, in-sequence async is a lot easier for people to reason about than parallel async, in particular when it comes to an async sequence of values. A lot of the landmines people step on with RxJS have to do with using parallel operations on a series of values and expecting them to always come back in a particular order. People are just really bad at understanding that.

AsyncIterable, in particular, lends itself VERY WELL to backpressure control because it mostly deals with async values in sequence, and I think that having it iterate a bunch of values and let them resolve in parallel kind of goes against that advantage it has. Given that nearly everything else about how people would deal with an AsyncIterable is sequential, I think `.drop` should be too.

If it can be explained like this it's easier to understand, IMO:

```ts
async function* drop<T>(source: AsyncIterable<T>, count: number) {
  let seen = 0;
  for await (const value of source) {
    if (seen < count) {
      seen++;
    } else {
      yield value;
    }
  }
}
```

As a general rule, I'd recommend sticking to the simple, one-at-a-time semantics of AsyncIterable as read via `for await…of`.
@benlesh Yeah, simplicity has a lot going for it. Maybe we could make concurrency here opt-in via a second parameter. Or maybe we could just let people do it themselves when they want concurrency, though getting the error handling right is a little tricky.
Parameterized concurrency could be done in a follow-up as well. FWIW, RxJS's parameterized concurrency is very rarely used. I know that's anecdotal, but it's a pretty big anecdote.
@benlesh That's good feedback. For some context, the reason this proposal is currently being held back is that we want to make sure there's room for adding consumer-driven concurrency at some point, which does necessarily affect the design now, not just in a follow-up. Consider the following snippet:

```ts
let x = asyncIteratorOfUrls
  .map(u => fetch(u));

await Promise.all([
  x.next(),
  x.next(),
]);
```

The idea is that if you go out of your way to call `.next()` multiple times without awaiting each result first, the underlying fetches can happen in parallel. Importantly, if you're just doing a normal `for await` loop, you only ever have one `next()` outstanding, so you get the simple sequential behavior.

But yes, probably people who are just calling `.next()` one at a time (or using `for await`) will get the one-at-a-time semantics.
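One subtlety worth noting (my own sketch, not from the thread): if a helper like `map` were implemented as a plain async generator, two eager `x.next()` calls would still not overlap, because async generators queue `next()` requests and resume the body for one request at a time. A hypothetical demo with a fake fetch that records when each request starts and ends:

```typescript
// Demo (hypothetical names): async generators serialize next() calls,
// so a generator-based map cannot overlap the mapped work.
const log: string[] = [];
const sleep = (ms: number) => new Promise<void>(r => setTimeout(r, ms));

async function* urls(): AsyncGenerator<string> {
  yield "a";
  yield "b";
}

// Stand-in for fetch: records when each "request" starts and ends.
async function fakeFetch(u: string): Promise<string> {
  log.push("start " + u);
  await sleep(10);
  log.push("end " + u);
  return u.toUpperCase();
}

async function* mapGen<T, U>(
  source: AsyncIterable<T>,
  fn: (t: T) => Promise<U>,
): AsyncGenerator<U> {
  for await (const v of source) {
    yield await fn(v);
  }
}

async function demo(): Promise<string[]> {
  const x = mapGen(urls(), fakeFetch);
  await Promise.all([x.next(), x.next()]);
  return log;
}

// demo() resolves to ["start a", "end a", "start b", "end b"]:
// the second next() waits for the first, so nothing ran in parallel.
```

This is part of why supporting consumer-driven concurrency affects the design up front: the helper has to be built on something other than a straightforward async generator if eager `next()` calls are meant to overlap.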
Consider an async iterator of fetches with `.drop(3)` applied. Should the first four fetches (the three dropped, and then the one which will be returned) happen in parallel, or in sequence?

That is, should the first call to `.next()` on the `.drop(N)` helper eagerly request all of the dropped items from the underlying iterator at once, or await them one at a time? (Some details of error handling omitted.)

I am inclined to say the former (they should be parallel), but that implies maybe we need a mechanism for async functions to limit their degree of parallelism (like in this npm package).
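The original code snippets in this issue did not survive, so here is a hedged reconstruction of the two alternatives being compared (names are illustrative; error handling simplified, as the issue notes):

```typescript
// Two sketches of what the first next() on .drop(n) could do.

// In sequence: each dropped read only starts after the previous one settles.
async function firstNextSequential<T>(
  iter: AsyncIterator<T>,
  n: number,
): Promise<IteratorResult<T>> {
  for (let i = 0; i < n; i++) {
    const r = await iter.next(); // next fetch starts only after this one finishes
    if (r.done) return r;
  }
  return iter.next();
}

// In parallel: request all n + 1 items up front, then await them in order.
async function firstNextParallel<T>(
  iter: AsyncIterator<T>,
  n: number,
): Promise<IteratorResult<T>> {
  const requests: Promise<IteratorResult<T>>[] = [];
  for (let i = 0; i <= n; i++) {
    requests.push(iter.next()); // all fetches kicked off immediately
  }
  const results = await Promise.all(requests); // error handling omitted
  return results.find(r => r.done) ?? results[n];
}

// Toy underlying iterator whose work begins as soon as next() is called.
function arrayIter<T>(arr: T[]): AsyncIterator<T> {
  let i = 0;
  return {
    async next(): Promise<IteratorResult<T>> {
      return i < arr.length
        ? { value: arr[i++], done: false }
        : { value: undefined as unknown as T, done: true };
    },
  };
}

// Both variants produce 4 as the first surviving value of [1, 2, 3, 4]
// with n = 3; they differ only in whether the dropped reads overlap.
```

The observable results are the same; the difference is purely in when the underlying work (e.g. the fetches) is started, which is exactly the question the issue poses.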