Want to run fully locally using OLLAMA and SEARXNG #217
Hi, have you tried the example script here: https://github.com/stanford-oval/storm/blob/main/examples/storm_examples/run_storm_wiki_ollama_with_searxng.py
What exactly is the searxng_api_url for SearXNG? Is it just the root domain, or /search?q=? Mine results in unsuccessful outcomes: python run_storm_wiki_ollama_with_searxng.py --searxng-api-url http://localhost:MY_PORT --do-research --do-generate-outline SearXNG works elsewhere (e.g. Open WebUI). Any ideas what might be wrong?
This seems to work for me ~50% of the time, and gives me errors like this the other half:
OK, my bug seems to be fixed by updating Ollama. The URL for SearXNG is "http://localhost:MY_PORT/search?"
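To double-check the point above, here is a minimal sketch of how the searxng_api_url could be normalized so that a bare root URL still ends up pointing at the /search endpoint. The helper name build_searxng_url and the port 8080 are assumptions for illustration, not part of STORM itself:

```python
from urllib.parse import urlencode


def build_searxng_url(base: str, query: str) -> str:
    """Normalize a SearXNG base URL and build a search request URL.

    Accepts either the root domain (http://localhost:8080) or the
    /search endpoint itself, and appends the query string with JSON
    output enabled.
    """
    base = base.rstrip("/")
    if not base.endswith("/search"):
        base = base + "/search"
    return base + "?" + urlencode({"q": query, "format": "json"})


# Both forms yield the same request URL:
print(build_searxng_url("http://localhost:8080", "storm wiki"))
# http://localhost:8080/search?q=storm+wiki&format=json
print(build_searxng_url("http://localhost:8080/search", "storm wiki"))
# http://localhost:8080/search?q=storm+wiki&format=json
```

Note that the JSON format must be enabled in the SearXNG instance's settings for format=json to work.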
Can anyone please guide me through setting this up fully locally on my PC?
LLM: Ollama client
Web search: SearXNG
I would be very grateful for any help.
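Putting the thread together, a fully local run could look like the sketch below. The SearXNG port (8080) is an assumption; the flags --searxng-api-url, --do-research, and --do-generate-outline are the ones shown earlier in this thread. Check the example script's --help for the remaining options (model name, Ollama URL, etc.):

```shell
# Assumes Ollama is running locally and a SearXNG instance is
# listening on port 8080 (adjust to your setup).
python examples/storm_examples/run_storm_wiki_ollama_with_searxng.py \
  --searxng-api-url "http://localhost:8080/search" \
  --do-research \
  --do-generate-outline
```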