
Unable to get any response from Retriever #16

Open
hemanthaar opened this issue Aug 13, 2024 · 6 comments
@hemanthaar commented Aug 13, 2024

After struggling a lot to run the chainlit app, I am facing a new issue. The agents are not able to retrieve any information even though the documents are successfully indexed.

[screenshot]

I am getting only this response, repeated, no matter what the question is.

[screenshot]

Can you suggest a solution or workaround? Thanks.

@karthik-codex (Owner)
What does your LLM config look like? Are you using the api_base of LiteLLM?

@hemanthaar (Author)
Yes. Here are my config details.

llm_config_autogen = {
    "seed": 42,  # change the seed for different trials
    "temperature": 0,
    "config_list": [
        {
            "model": "litellm",
            "base_url": "http://127.0.0.1:4000/",
            "api_key": "ollama",
        },
    ],
    "timeout": 60000,
}
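For context, a minimal sketch of how a config like this is typically handed to pyautogen agents; only llm_config_autogen above comes from this thread, and the agent names and test message are hypothetical.

import autogen

# llm_config_autogen is the dict defined above; the agents below are
# illustrative placeholders only.
assistant = autogen.AssistantAgent(
    name="retriever_assistant",  # hypothetical name
    llm_config=llm_config_autogen,
)
user_proxy = autogen.UserProxyAgent(
    name="user",
    human_input_mode="NEVER",
    code_execution_config=False,
)
# Kick off a chat that routes through the LiteLLM proxy at base_url.
user_proxy.initiate_chat(assistant, message="Summarize the indexed documents.")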

@tdemel commented Aug 17, 2024

I am having the same issue. The error message can be traced to the autogen code, so I thought it might be related to autogen versioning: I can otherwise connect to my Ollama models without issues, and a new autogen version was released on 13 August. However, even with the previous autogen version I receive the same error. I haven't tried a much older version. It would be great if someone knows a solution to this.

@hemanthaar (Author)

@tdemel You are right. It didn't occur to me that the pyautogen version could be the issue. You can install autogen version 0.2.32 and it works, but the response comes in the terminal instead of the Chainlit app. Let me know if you find a solution for this.

[screenshots]
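For anyone hitting the terminal-vs-UI behavior described above, a minimal sketch of a Chainlit handler that sends the result back to the app rather than relying on stdout; get_answer is a hypothetical stand-in for whatever retriever or agent call you use.

import chainlit as cl

def get_answer(question: str) -> str:
    # Hypothetical placeholder: wire this to your actual retriever/agent call.
    return f"(answer for: {question})"

@cl.on_message
async def on_message(message: cl.Message):
    result = get_answer(message.content)
    # Reply inside the Chainlit UI instead of printing to the terminal.
    await cl.Message(content=str(result)).send()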

@zluckymn
On 0.2.32, I got the correct message by implementing the following steps:

CONFIG_FILEPATH = './settings.yaml'

# Updated signature: pass the config filepath explicitly.
def run_local_search(
    config_filepath: str | None,
    data_dir: str | None,
    root_dir: str | None,
    community_level: int,
    response_type: str,
    streaming: bool,
    query: str,
):
    ...  # function body omitted in the original comment

if LOCAL_SEARCH:
    result = run_local_search(CONFIG_FILEPATH, INPUT_DIR, ROOT_DIR, COMMUNITY, RESPONSE_TYPE, True, question)
else:
    result = run_global_search(CONFIG_FILEPATH, INPUT_DIR, ROOT_DIR, COMMUNITY, RESPONSE_TYPE, True, question)
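The snippet above references several names it does not define (LOCAL_SEARCH, INPUT_DIR, ROOT_DIR, COMMUNITY, RESPONSE_TYPE, question). As a rough sketch, placeholder definitions might look like this; every value here is an assumption to be replaced with your own setup, not something taken from this thread.

# Placeholder definitions for the names used above; adjust to your setup.
LOCAL_SEARCH = True                    # True = local search, False = global search
INPUT_DIR = "path/to/index/output"     # directory containing the indexing artifacts
ROOT_DIR = "."                         # project root containing settings.yaml
COMMUNITY = 2                          # GraphRAG community level
RESPONSE_TYPE = "Multiple Paragraphs"  # desired response format
question = "What are the main topics in the documents?"  # example query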

@michael-hoon
Still having the same issue here; has anyone managed to solve this?
