Non-Streaming Output from LLM Model in Dify Usage #12393
Replies: 2 comments
-
There is a known issue in Dify 0.14.2 that can prevent streaming from working as expected in a self-hosted environment. When using a My-LLM model deployed by MS-Swift, streaming mode fails because the response data lacks a space after 'data:' in the server-sent events; this formatting issue breaks compatibility with Dify's streaming implementation [1]. To enable streaming output for the LLM model in your self-hosted Dify setup, ensure that the request's `response_mode` is set to "streaming" rather than "blocking". If you are already setting this and the output still arrives in one block, the formatting issue described above is the likely cause. To continue talking to Dosu, mention @dosu.
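For reference, below is a minimal sketch of requesting streaming output from a self-hosted Dify instance over its service API. The URL and API key are placeholders, and the event field names (`event`, `answer`, `message_end`) follow Dify's documented SSE schema, so verify them against your version; the lenient `data:` prefix handling is there specifically to tolerate the missing space described above.

```python
import json
import requests

# Placeholders: point these at your self-hosted instance and app API key.
API_URL = "http://localhost/v1/chat-messages"
API_KEY = "app-xxxxxxxx"

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "inputs": {},
        "query": "Hello, please stream your reply.",
        "response_mode": "streaming",  # "blocking" returns one final JSON body
        "user": "debug-user",
    },
    stream=True,  # let requests hand back the SSE body line by line
)

for line in resp.iter_lines(decode_unicode=True):
    if not line:
        continue  # SSE events are separated by blank lines
    # Accept both "data: {...}" and the non-standard "data:{...}"
    # that MS-Swift emits; the missing space trips strict SSE parsers.
    if line.startswith("data:"):
        payload = line[len("data:"):].lstrip()
        event = json.loads(payload)
        if event.get("event") == "message_end":
            break
        # Partial text arrives in the "answer" field of "message" events.
        print(event.get("answer", ""), end="", flush=True)
print()
```

If this prints the answer token by token, streaming works end to end; if the whole answer appears at once, the model provider's SSE formatting is the place to look.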
-
I use Gemini; it also does not work.
-
Self Checks
Dify version
0.14.2
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
Dear Support Team,
I am encountering an issue while using Dify: the LLM (Large Language Model) does not provide streaming output. This behavior is consistent even with the most basic chat-assistant configuration. Interestingly, testing the same model on the official website at https://cloud.dify.ai/ yields the expected streaming output.
After reviewing existing issues, I have not found any setting related to enabling or disabling streaming in the application or its documentation, so it is unclear how to achieve streaming output in my local or custom deployment of Dify.
Could you please advise on whether there are specific settings or configurations that need to be adjusted to enable streaming output? If this is a known limitation or if there are workarounds available, your guidance would be greatly appreciated.
Thank you for your support.
✔️ Expected Behavior
Streaming output from the LLM.
❌ Actual Behavior
The LLM returns the complete answer in a single block instead of streaming it.
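To help pin down where streaming breaks, a small timing check can compare the two response modes against the self-hosted API: with real streaming the gap between the first and last received line is clearly non-zero, while in blocking mode the whole body arrives at once. This is only a sketch; the URL and key are placeholders for a local Dify instance and an app API key.

```python
import time
import requests

# Placeholders: adjust to your self-hosted instance and app key.
API_URL = "http://localhost/v1/chat-messages"
API_KEY = "app-xxxxxxxx"

def first_and_last_chunk_times(response_mode: str) -> tuple[float, float]:
    """Return seconds until the first and last received line for a mode."""
    start = time.monotonic()
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "inputs": {},
            "query": "Write three sentences about streaming.",
            "response_mode": response_mode,
            "user": "debug-user",
        },
        stream=True,
    )
    first = None
    last = 0.0
    for _ in resp.iter_lines():
        now = time.monotonic() - start
        if first is None:
            first = now
        last = now
    return first or 0.0, last

for mode in ("streaming", "blocking"):
    first, last = first_and_last_chunk_times(mode)
    # With working streaming, last - first is clearly non-zero.
    print(f"{mode}: first line at {first:.2f}s, last at {last:.2f}s")
```

If both modes show first and last lines arriving at essentially the same moment, the server is not streaming at all, which points at the model provider rather than the Dify frontend.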