Add support for Ollama #188

Open
dhirenmathur opened this issue Nov 22, 2024 · 3 comments
@dhirenmathur (Contributor)

Overview
Add support for Ollama to enable users to run open-source models locally.
This feature is important for privacy-focused users who do not want to share their code with LLM providers.

Requirements

  • Integration with the Ollama API using LangChain (see the sketch below)
  • Support for multiple open-source models available through Ollama
  • Seamless switching between cloud and local models via the providers API
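
A minimal sketch of the LangChain integration, assuming the `langchain-ollama` package and an Ollama daemon on its default port; the model name is an example only:

```python
# Minimal sketch: talking to a local Ollama server through LangChain.
# Assumes `pip install langchain-ollama` and a model pulled locally
# via `ollama pull llama3.1`; names here are examples, not project code.
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3.1",                   # any locally pulled model
    base_url="http://localhost:11434",  # Ollama's default endpoint
    temperature=0.3,
)

response = llm.invoke("Summarize what a knowledge graph is in one sentence.")
print(response.content)
```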

Technical Details

  • Implement an Ollama API client
  • Add configuration options for the Ollama endpoint and model selection (see the configuration sketch below)
  • Ensure compatibility with existing application interfaces
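
A hypothetical configuration sketch for the endpoint and model options; the environment variable names and defaults are illustrative, not taken from the codebase:

```python
# Hypothetical configuration sketch: select the Ollama endpoint and
# model via environment variables, falling back to sensible defaults.
# OLLAMA_ENDPOINT and OLLAMA_MODEL are assumed names, not potpie's.
import os
from dataclasses import dataclass, field

@dataclass
class OllamaSettings:
    endpoint: str = field(
        default_factory=lambda: os.getenv("OLLAMA_ENDPOINT", "http://localhost:11434")
    )
    model: str = field(
        default_factory=lambda: os.getenv("OLLAMA_MODEL", "llama3.1")
    )

settings = OllamaSettings()
print(settings.endpoint, settings.model)
```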

Success Criteria

  • Users can run local models through Ollama
  • Successful knowledge graph creation and agent execution
dhirenmathur added the enhancement and help wanted labels on Nov 22, 2024
@waveywaves (Contributor)

@dhirenmathur I would like to work on this; can you assign it to me?

@dhirenmathur (Contributor, Author)

@waveywaves are you working on this, or can I assign it to someone else?

@waveywaves (Contributor) commented Dec 26, 2024

@dhirenmathur Hey Dhiren! I'm continuing work on this and will post an update within the next day.

vishwamartur added a commit to vishwamartur/potpie that referenced this issue on Jan 5, 2025
Related to potpie-ai#188

Add support for Ollama to enable users to run open-source models locally.

* **Provider Service Integration**
  - Add Ollama API integration in `app/modules/intelligence/provider/provider_service.py`
  - Implement a method to get an Ollama LLM
  - Update the `list_available_llms` method to include Ollama (sketch below)
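
An illustrative sketch of these two provider-service changes, assuming a LangChain-based service; potpie's actual `ProviderService` interface may differ:

```python
# Illustrative sketch only: the real ProviderService in
# app/modules/intelligence/provider/provider_service.py may differ.
from langchain_ollama import ChatOllama

class ProviderService:
    def get_ollama_llm(self, model: str = "llama3.1",
                       base_url: str = "http://localhost:11434") -> ChatOllama:
        # Return a LangChain chat model backed by the local Ollama server.
        return ChatOllama(model=model, base_url=base_url)

    def list_available_llms(self) -> list[dict]:
        # Ollama is listed alongside the existing cloud providers.
        return [
            {"provider": "openai", "local": False},
            {"provider": "anthropic", "local": False},
            {"provider": "ollama", "local": True},
        ]
```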

* **Configuration Options**
  - Add configuration options for Ollama endpoint and model selection in `app/core/config_provider.py`
  - Update the `ConfigProvider` class to include Ollama settings (sketch below)
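
A sketch of what the `ConfigProvider` additions might look like; the method and key names are assumptions, not taken from `app/core/config_provider.py`:

```python
# Assumed shape of the ConfigProvider additions; key and method names
# are illustrative, not taken from the actual codebase.
import os

class ConfigProvider:
    def get_ollama_config(self) -> dict:
        return {
            "endpoint": os.getenv("OLLAMA_ENDPOINT", "http://localhost:11434"),
            "model": os.getenv("OLLAMA_MODEL", "llama3.1"),
        }
```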

* **Agent Factory and Injector Service**
  - Add support for Ollama models in `app/modules/intelligence/agents/agent_factory.py`
  - Implement a method to create an Ollama agent
  - Add support for Ollama models in `app/modules/intelligence/agents/agent_injector_service.py`
  - Implement a method to get an Ollama agent (sketch below)
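
A sketch of how the factory might branch on provider; the function names mirror the bullets above, but the surrounding interfaces are assumptions:

```python
# Sketch of provider dispatch in the agent factory; interfaces assumed.
from langchain_ollama import ChatOllama

def create_ollama_agent(model: str, base_url: str) -> ChatOllama:
    # In potpie the LLM would be wired into the existing agent classes;
    # returning it directly keeps this sketch self-contained.
    return ChatOllama(model=model, base_url=base_url)

def create_agent(provider: str, **kwargs):
    if provider == "ollama":
        return create_ollama_agent(kwargs["model"], kwargs["base_url"])
    raise ValueError(f"Unsupported provider: {provider}")
```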

* **Tool Service**
  - Add tools for Ollama model support in `app/modules/intelligence/tools/tool_service.py`
  - Implement methods to interact with Ollama models (sketch below)
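
A sketch of tool interaction with an Ollama model via LangChain tool calling; it assumes a tool-calling-capable local model, and `get_file_summary` is a made-up example tool, not one of potpie's:

```python
# Sketch: binding a LangChain tool to an Ollama-backed chat model.
# Requires a local model that supports tool calling; the tool below
# is a hypothetical placeholder, not a potpie tool.
from langchain_core.tools import tool
from langchain_ollama import ChatOllama

@tool
def get_file_summary(path: str) -> str:
    """Return a placeholder summary for a file path."""
    return f"summary of {path}"

llm = ChatOllama(model="llama3.1", base_url="http://localhost:11434")
llm_with_tools = llm.bind_tools([get_file_summary])

result = llm_with_tools.invoke("Summarize app/main.py")
print(result.tool_calls)  # any tool calls the model decided to make
```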