Unable to get any response from Retriever #16
Comments
What does your LLM config look like? Are you using the api_base of LiteLLM?
Yes. Here are my config details: `llm_config_autogen = {`
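(The config above is truncated in the thread. For readers hitting the same thing, here is a minimal sketch of what an AutoGen `llm_config` pointed at a LiteLLM proxy typically looks like; the model name, port, and timeout values are placeholder assumptions, not the poster's actual settings:)

```python
# Hypothetical sketch, not the poster's actual config.
# Assumes a LiteLLM proxy running locally (default port 4000)
# in front of an Ollama model; adjust model/base_url to your setup.
llm_config_autogen = {
    "config_list": [
        {
            "model": "ollama/llama3",             # assumed model name
            "base_url": "http://localhost:4000",  # LiteLLM proxy endpoint
            "api_key": "sk-placeholder",          # LiteLLM accepts a dummy key
        }
    ],
    "timeout": 600,
    "cache_seed": None,  # disable response caching while debugging
}
```

Note that pyautogen 0.2.x expects `base_url` where older releases used `api_base`, which is one way a version bump can silently break a previously working config.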
I am having the same issue. The error message originates in the autogen code, so I suspected an autogen versioning problem, since I can otherwise connect to my Ollama models without issues and a new AutoGen version was released on 13 August. However, even with the previous AutoGen version I get the same error. I haven't tried a much older version. It would be great if someone knows a solution to this.
@tdemel You are right. It didn't occur to me that the pyautogen version could be the issue. Installing version 0.2.32 (`pip install pyautogen==0.2.32`) works, but the response comes in the terminal instead of the Chainlit app. Let me know if you find a solution for this.
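(For anyone trying the pin, a quick way to confirm which version actually loaded, assuming the package imports as `autogen`, as pyautogen 0.2.x does:)

```python
# Run after: pip install pyautogen==0.2.32
import autogen

print(autogen.__version__)  # expect "0.2.32"
```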
```python
CONFIG_FILEPATH = './settings.yaml'
def run_local_search(
if LOCAL_SEARCH:
```
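(The snippet above is cut off in the thread. Here is a rough sketch of how a `run_local_search` helper like this is commonly wired up against the GraphRAG query CLI; the function body, the `LOCAL_SEARCH` flag, and the CLI flags are assumptions based on graphrag 0.x, not the code from this repo:)

```python
import subprocess

CONFIG_FILEPATH = './settings.yaml'
LOCAL_SEARCH = True  # assumed toggle between local and global search


def run_local_search(query: str) -> str:
    """Shell out to the GraphRAG query CLI and return its stdout.

    Assumes graphrag 0.x, where `python -m graphrag.query` accepts
    --root, --method and a positional query string.
    """
    result = subprocess.run(
        [
            "python", "-m", "graphrag.query",
            "--root", ".",
            "--method", "local",
            query,
        ],
        capture_output=True,
        text=True,
        check=True,  # raise if the CLI exits non-zero
    )
    return result.stdout


if LOCAL_SEARCH:
    print(run_local_search("What are the top themes in the indexed documents?"))
```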
Still having the same issue here. Has anyone managed to solve this?
After struggling a lot to get the Chainlit app running, I am facing a new issue: the agents are not able to retrieve any information, even though the documents are successfully indexed.
I am getting only this response, repeated, no matter what the question is.
Can you suggest any solution or workaround? Thanks.
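(One hedged debugging step, not from the thread: before blaming the agents, check that indexing actually produced artifacts and that GraphRAG answers outside Chainlit/AutoGen. The directory layout below assumes GraphRAG's default `output/<timestamp>/artifacts` structure:)

```python
from pathlib import Path

# Assumes GraphRAG's default output layout; adjust `root` to your project.
root = Path(".")
artifacts = sorted(root.glob("output/*/artifacts/*.parquet"))

if not artifacts:
    print("No parquet artifacts found; indexing probably did not complete.")
else:
    for path in artifacts:
        print(f"{path.name}: {path.stat().st_size} bytes")
```

If the artifacts are present and a direct CLI query (as in the sketch earlier in the thread) returns an answer, the problem is more likely in the agent/Chainlit wiring than in the retriever itself.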