## Overview

Add support for Ollama to enable users to run open-source models locally.
This feature is required for privacy-focused users who do not want to share their code with cloud LLM providers.
## Requirements

- Integrate with the Ollama API via LangChain
- Support the open-source models available through Ollama
- Allow seamless switching between cloud and local models through the providers API
## Technical Details

- Implement an Ollama API client
- Add configuration options for the Ollama endpoint and model selection
- Ensure compatibility with existing application interfaces
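A minimal client for the first two items could look like the sketch below, using only the standard library. The `/api/generate` path and `stream` flag are part of Ollama's documented REST API; the `OllamaClient` class name and its method names are assumptions for illustration.

```python
import json
import urllib.request

class OllamaClient:
    """Minimal sketch of an Ollama API client (illustrative, not potpie's real code)."""

    def __init__(self, base_url: str = "http://localhost:11434") -> None:
        self.base_url = base_url.rstrip("/")

    def build_generate_request(self, model: str, prompt: str) -> urllib.request.Request:
        """Construct (but do not send) a completion request."""
        payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
        return urllib.request.Request(
            f"{self.base_url}/api/generate",
            data=payload.encode(),
            headers={"Content-Type": "application/json"},
        )

    def generate(self, model: str, prompt: str) -> str:
        """Send the request to a running Ollama server and return the response text."""
        with urllib.request.urlopen(self.build_generate_request(model, prompt)) as resp:
            return json.load(resp)["response"]
```

Separating request construction from sending keeps the endpoint/model configuration testable without a running Ollama server.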
## Success Criteria

- Users can run local models through Ollama
- Knowledge graph creation and agent execution succeed with local models
Related to potpie-ai#188
To add Ollama support for running open-source models locally, the following changes are needed:
* **Provider Service Integration**
  - Add Ollama API integration in `app/modules/intelligence/provider/provider_service.py`
  - Implement a method to get an Ollama LLM
  - Update the `list_available_llms` method to include Ollama
* **Configuration Options**
  - Add configuration options for the Ollama endpoint and model selection in `app/core/config_provider.py`
  - Update the `ConfigProvider` class to include Ollama settings
* **Agent Factory and Injector Service**
  - Add support for Ollama models in `app/modules/intelligence/agents/agent_factory.py`
  - Implement a method to create an Ollama agent
  - Add support for Ollama models in `app/modules/intelligence/agents/agent_injector_service.py`
  - Implement a method to get an Ollama agent
* **Tool Service**
  - Add tools for Ollama model support in `app/modules/intelligence/tools/tool_service.py`
  - Implement methods to interact with Ollama models
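The provider-service items in the checklist above could take roughly this shape. Everything here is a hedged sketch: the function names mirror the checklist (`list_available_llms`, `get_llm`) but their signatures and return shapes are assumptions, not potpie's actual interfaces.

```python
def list_available_llms() -> list[dict]:
    """Return cloud providers plus the local Ollama entry (illustrative data)."""
    return [
        {"id": "openai", "name": "OpenAI", "local": False},
        {"id": "anthropic", "name": "Anthropic", "local": False},
        {"id": "ollama", "name": "Ollama (local)", "local": True},
    ]

def get_llm(provider_id: str, model: str) -> dict:
    """Dispatch on provider id; the Ollama branch carries its local endpoint.

    In the real integration this would return a LangChain chat-model instance
    pointed at the configured endpoint rather than a plain dict.
    """
    if provider_id == "ollama":
        return {"provider": "ollama", "model": model, "endpoint": "http://localhost:11434"}
    raise ValueError(f"Unsupported provider: {provider_id}")
```

Listing Ollama alongside cloud providers lets the existing model-selection UI expose local models without a separate code path.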