When the workflow configuration becomes long, the stream output format is lost #11985
Comments
Hey @ITMeow! 👋 I'm here to help you with any bugs, questions, or contributions you may have while waiting for a human maintainer. To maintain the streaming output format in a self-hosted Dify workflow when the configuration becomes long and reaches the last LLM node, you should add a Direct Reply node inside the iteration node. This setup allows streaming output after each iteration, so the output keeps its streaming format even when the configuration is extensive [1].
I think you misunderstood me. I said that there are many nodes in the chatflow, not long text, and I did not use iteration nodes. Up until the 0.14.0 update, the LLM output displayed in streaming form without issue.
After I removed the variable aggregator, the output is still blocking rather than streaming. Could you please guide me on how to modify the yml file to resolve this issue?
I encountered the same problem: simple workflows are normal, but complex workflows do not have streaming output. Version: 0.14.2
Can you share the DSL or the prompt?
+1 Experiencing exactly the same issue
@laipz8200 Sorry to bother you. At your convenience, could you please review this issue? It's currently impacting our production business. Really appreciate your help.
Can you share your DSL? I would like to take a look.
Sure, what's the best way to share it privately with you?
Email is fine: zouzou0208@gmail.com
@yihong0618 Same here: if we add a simple if node, the stream is lost! You can try that DSL on the cloud. Gravacao.de.Tela.2025-01-03.as.12.11.37.mov: https://drive.google.com/file/d/1-8CWhWeC0p-BIDlzeDmvHd1JE3Q87hHJ/view?usp=sharing
Can you try this patch? @Kevin9703 @ritamariavermelho06
It works now! But I'm wondering, what was the root cause?
The logic here is a bit complicated; you can check the diff and the history.
Self Checks
Dify version
0.14.1
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
From the image, it can be seen that when many nodes are configured upstream, such as knowledge base retrieval and code execution, the streaming output turns into blocking-mode output once the process reaches the last LLM node. The API output is affected as well.
✔️ Expected Behavior
When the last LLM node outputs, it should keep streaming.
❌ Actual Behavior
The output is now blocking.
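For anyone who wants to confirm whether the API response is truly blocking rather than streamed, here is a minimal sketch against the Dify `chat-messages` endpoint with `response_mode: "streaming"`. The base URL, API key, and query below are placeholders, not values from the reporter's setup. With working streaming, `message` events should arrive spread out over the generation; with the bug described above, they arrive in one burst at the end.

```python
import json
import time

import requests

API_BASE = "http://localhost/v1"  # placeholder: your self-hosted Dify
API_KEY = "app-xxxxxxxx"          # placeholder: the app's API key

resp = requests.post(
    f"{API_BASE}/chat-messages",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "inputs": {},
        "query": "Hello",              # placeholder query
        "response_mode": "streaming",  # request SSE output
        "user": "stream-test",
    },
    stream=True,  # don't let requests buffer the whole body
)
resp.raise_for_status()

start = time.monotonic()
for line in resp.iter_lines(decode_unicode=True):
    if not line or not line.startswith("data:"):
        continue  # skip keep-alives and blank SSE separators
    event = json.loads(line[len("data:"):].strip())
    if event.get("event") == "message":
        # With true streaming these timestamps spread out over the
        # generation; with blocking output they cluster at the end.
        print(f"{time.monotonic() - start:6.2f}s  {event.get('answer', '')!r}")
```

If the timestamps all cluster within a few milliseconds of each other, the server generated the full answer before emitting any SSE events, which matches the blocking behavior reported here.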