Check partial answers for ongoing requests #141
I have made a proposal for progress in #70
This is more than a progress report. I'd like to be able to get a list of partial results for an ongoing request. This should be something the client requests.
Why not? Personally, I very much like the idea of a fully async and incremental mechanism for sending completions (for example). It allows the server to deliver results as soon as it has them, rather than having to buffer them and make an awkward implementation decision about how much time is 'reasonable' to spend computing as many results as possible before returning an incomplete list. This often means you end up returning 0 completions even though the user was willing to wait a bit (or just stopped typing for a moment to reflect).
I also think a fully async streaming API would be neat. It has the downside that it offers no flow control. Also, I believe that clients might have trouble handling it properly, making it less likely to get implemented. I started with a less ambitious suggestion, hoping it would be accepted more easily. I have absolutely no objection to a fully async protocol when it is supported by both the client and the server. There will have to be a fallback solution for participants that need flow control or can't handle async streams of events.
With the new capability flag support in 3.0, I can imagine a flag 'supportsResultStreaming' on the completion capability to indicate that the client accepts streamed results. Then the server can stream results.
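The capability handshake imagined above could look like the following TypeScript sketch. Note that 'supportsResultStreaming' and these interface names come from the proposal in this comment, not from the actual LSP specification, so treat them as hypothetical:

```typescript
// Hypothetical sketch of the proposed capability flag. These shapes are
// assumptions for illustration, not official LSP types.
interface CompletionClientCapabilities {
  supportsResultStreaming?: boolean; // proposed flag from the comment above
}

interface ClientCapabilities {
  completion?: CompletionClientCapabilities;
}

// Server-side check before deciding to stream partial completion lists.
function canStreamCompletions(caps: ClientCapabilities): boolean {
  return caps.completion?.supportsResultStreaming === true;
}

const caps: ClientCapabilities = {
  completion: { supportsResultStreaming: true },
};
console.log(canStreamCompletions(caps));
```

A server would read this flag once during initialization and fall back to buffered, single-response behavior when it is absent.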
How exactly would that work? Streaming depends entirely on the client parsing the response as a JSON stream, for example if it is an array, emitting each array item immediately instead of waiting for the whole response to parse completely. EDIT: Okay, it doesn't work, because the server needs to send a
Not necessarily, I think. The server could also send multiple answers to a request, with a flag saying whether it's done or not. There should be a merging policy for the non-array values. This way, it works fine even if a response contains two arrays that are computed in parallel (there isn't such a response now, but there could be).
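The multiple-answers idea could be modeled as below. This is a non-authoritative sketch: the PartialResponse shape and its 'done' flag are assumptions for illustration, not LSP types. Array fields from successive chunks for the same request id are concatenated, and the final chunk is marked done:

```typescript
// Assumed chunk shape: several of these arrive for one request id,
// the last one with done = true.
interface PartialResponse<T> {
  id: number;    // request id the chunk belongs to
  items: T[];    // partial results computed so far
  done: boolean; // true on the final chunk
}

// Merging policy for array values: concatenate chunks in arrival order.
function mergePartials<T>(chunks: PartialResponse<T>[]): T[] {
  return chunks.flatMap((c) => c.items);
}

const chunks: PartialResponse<string>[] = [
  { id: 7, items: ["fooBar", "fooBaz"], done: false },
  { id: 7, items: ["fooQux"], done: true },
];
console.log(mergePartials(chunks));
```

Non-array values would need a separate policy (for example, last-write-wins), which is the open question the comment raises.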
Agree with @vladdu. This can be achieved using paging support. That would even allow the client to request all results if a user picks an already-sent one, in the code-completion example.
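A minimal sketch of how such paging could work, assuming a hypothetical pageToken cursor (none of these names come from LSP): the client asks for one page at a time, and a missing nextPageToken signals that all results have been delivered.

```typescript
// Hypothetical paged request/response shapes, for illustration only.
interface PagedRequest {
  pageToken?: string; // absent on the first request
  pageSize: number;
}

interface PagedResult<T> {
  items: T[];
  nextPageToken?: string; // absent when there are no more results
}

// Serve one page out of an already-computed result list. Here the token
// is simply the next start index, encoded as a string.
function page<T>(all: T[], req: PagedRequest): PagedResult<T> {
  const start = req.pageToken ? parseInt(req.pageToken, 10) : 0;
  const items = all.slice(start, start + req.pageSize);
  const end = start + items.length;
  return {
    items,
    nextPageToken: end < all.length ? String(end) : undefined,
  };
}

const first = page(["a", "b", "c"], { pageSize: 2 });
console.log(first.items, first.nextPageToken);
```

Paging gives the flow control that pure streaming lacks: the client decides when (and whether) to pull the next batch.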
Partial results are possible with our streaming implementation: #182
Marking as dup of #110
For long-running requests, it makes sense to show partial results in the UI (for example when searching for all references in a workspace, but even for completion). Streaming them from the server is not a very good idea, but the client could send progress requests whenever it wants and receive the current results, updating the UI. Kind of like cancelRequest, but without cancelling.
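The poll-without-cancelling idea could be sketched like this. The '$/partialResult' method name and these shapes are assumptions for illustration, not part of LSP: the client sends the id of an in-flight request, and the server replies with a snapshot of whatever it has computed so far while the work keeps running.

```typescript
// Hypothetical poll request: like cancelRequest it names an in-flight
// request by id, but the server only reports, it does not cancel.
interface PartialResultParams {
  id: number; // id of the ongoing request being polled
}

// Results accumulated so far for each in-flight request id.
const inFlight = new Map<number, string[]>();

function onPartialResultRequest(params: PartialResultParams): string[] {
  // Return a copy of the current snapshot; unknown ids yield no results.
  return [...(inFlight.get(params.id) ?? [])];
}

// Example: a find-all-references request (id 42) has found two
// locations so far; a poll returns them while the search continues.
inFlight.set(42, ["src/a.ts:10", "src/b.ts:3"]);
console.log(onPartialResultRequest({ id: 42 }));
```

Because the client initiates each poll, this keeps flow control on the client side, which addresses the objection to pure server-push streaming raised earlier in the thread.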