With AI SDK UI - How to lazy load response #3348
Unanswered
Godrules500 asked this question in Help

With RSC, I was able to get the loading indicator to display immediately because I replied to the user before processing their request (loading images, manipulating the messages, etc.). Then, as the AI responded, I displayed its output to them. How can I do this with the AI SDK UI? Essentially, I want to reply to the user right away with a placeholder and then have streamText stream into that response. How do I create this exact setup with the Vercel AI SDK UI so that streamText still displays its output to the user once Gemini responds?
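For reference, the RSC version of this pattern looks roughly like the sketch below: return a streamable UI placeholder immediately, then update it as the model responds. This is a minimal illustration, not the original code; it assumes `createStreamableUI` from `ai/rsc`, `streamText` from a recent `ai` release (4.x, where `streamText` is not awaited), and the `@ai-sdk/google` provider. `Spinner` and `prepareMessages` are hypothetical stand-ins for the app's own loading UI and preprocessing.

```tsx
// actions.tsx -- Server Action that replies with a spinner before doing any heavy work
'use server';

import { createStreamableUI } from 'ai/rsc';
import { streamText, type CoreMessage } from 'ai';
import { google } from '@ai-sdk/google';

// Hypothetical stand-ins for the app's real loading UI and preprocessing.
const Spinner = () => <p>Loading…</p>;
async function prepareMessages(question: string): Promise<CoreMessage[]> {
  // e.g. load images, manipulate prior messages, etc.
  return [{ role: 'user', content: question }];
}

export async function submitMessage(question: string) {
  // The spinner is sent to the client before any slow work starts.
  const ui = createStreamableUI(<Spinner />);

  // Run the preparation and the model call without awaiting them,
  // so ui.value is returned (and rendered) immediately.
  (async () => {
    const messages = await prepareMessages(question);

    const result = streamText({
      model: google('gemini-1.5-flash'),
      messages,
    });

    let text = '';
    for await (const delta of result.textStream) {
      text += delta;
      ui.update(<p>{text}</p>); // swap the spinner for partial output
    }
    ui.done(<p>{text}</p>);
  })();

  return ui.value;
}
```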
Replies: 2 comments · 1 reply
- Additional info: Obviously I can just not await it, and that works, but the loading indicator still doesn't show up until streamText starts responding. So I essentially need a way to return a "fake" response immediately and then have streamText tie into it once it starts streaming.
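One way to get that behavior with AI SDK UI is to open the response stream in the route handler right away, write a status value into it, and merge the streamText output into the same stream once the slow work is done. This is a sketch under assumptions, not the thread's confirmed resolution: it assumes a recent `ai` release (4.x), where `createDataStreamResponse`, `writeData`, and `mergeIntoDataStream` are available, plus the `@ai-sdk/google` provider; `prepareMessages` is a hypothetical stand-in for the slow preprocessing.

```ts
// app/api/chat/route.ts -- the Response is returned immediately; execute() keeps writing into it
import {
  convertToCoreMessages,
  createDataStreamResponse,
  streamText,
  type CoreMessage,
} from 'ai';
import { google } from '@ai-sdk/google';

// Hypothetical stand-in for the slow work (loading images, reshaping messages, ...).
async function prepareMessages(messages: CoreMessage[]): Promise<CoreMessage[]> {
  return messages;
}

export async function POST(req: Request) {
  const { messages } = await req.json();

  return createDataStreamResponse({
    execute: async (dataStream) => {
      // Optional: push a value the client can read from useChat's `data`
      // while the preprocessing is still running.
      dataStream.writeData({ status: 'preparing' });

      const prepared = await prepareMessages(convertToCoreMessages(messages));

      const result = streamText({
        model: google('gemini-1.5-flash'),
        messages: prepared,
      });

      // Pipe the model tokens into the already-open stream.
      result.mergeIntoDataStream(dataStream);
    },
  });
}
```

Because the Response is created before the preprocessing runs, the client's request resolves to an open stream immediately, and anything written with `writeData` shows up in `useChat`'s `data` array before the first model token arrives.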
- You can check the loading state and create UI elements as needed while it is loading. See this example: https://sdk.vercel.ai/docs/ai-sdk-ui/chatbot#loading-state
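On the client, the loading-state example from that page looks roughly like this. A minimal sketch, assuming `useChat` from `ai/react` (published as `@ai-sdk/react` in newer releases), which exposes an `isLoading` flag:

```tsx
'use client';

import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat();

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}

      {/* isLoading is true from the moment the request is submitted, so the
          indicator shows during server-side preprocessing, before the first
          token from streamText arrives. */}
      {isLoading && <p>Thinking…</p>}

      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} disabled={isLoading} />
      </form>
    </div>
  );
}
```

Since `isLoading` tracks the whole request rather than just the token stream, this should surface the indicator as soon as the user submits, without the server needing to send anything first.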