DevIO is a professional Flutter application for connecting to local LLM (Large Language Model) servers from your macOS desktop or mobile device. It features a clean, modern, minimal UI that prioritizes ease of use without sacrificing functionality.
DevIO turns your device into an interface for locally hosted large language models: connect to Ollama or other LLM servers and use AI while keeping your data private and secure.
- macOS (Universal)
- iOS
- Android (coming soon)
- Web (experimental)
- Connect to locally hosted Ollama instances with customizable server configuration
- Supports various open-source models (llama3, deepseek, mistral, phi3, etc.)
- Performance metrics tracking and real-time monitoring (see the sketch after this list)
- Automatic reconnection and connection status indicators
- Advanced settings for context size, timeout, and thread configuration
- Clean, minimal design with both dark and light theme support
- Responsive layout optimized for various screen sizes
- Robust state management built on the BLoC/Cubit pattern (flutter_bloc)
- Onboarding screens for new users with interactive setup guide
- Multi-session chat management with conversation history
- Code highlighting and formatting for code snippets
- Image analysis with multimodal model support
- Document handling with PDF support
- Real-time typing indicators and message status
- Firebase authentication with multiple sign-in options
- Optional cloud synchronization for chat history
- Local processing keeps sensitive data on your device
- Demo mode for exploring features without a connection
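For a sense of how the performance metrics above can be computed, here is a minimal sketch (illustrative, not DevIO's actual code) that calls Ollama's `/api/generate` endpoint with the `http` package and derives tokens-per-second from the counters Ollama returns; the helper name and hard-coded model are assumptions:

```dart
import 'dart:convert';

import 'package:http/http.dart' as http;

// Hypothetical helper: queries Ollama and prints generation metrics.
Future<void> generateWithMetrics(String host, String prompt) async {
  final res = await http.post(
    Uri.parse('http://$host/api/generate'),
    headers: {'Content-Type': 'application/json'},
    body: jsonEncode({
      'model': 'llama3', // assumed model; any pulled model works
      'prompt': prompt,
      'stream': false, // return one JSON object instead of a stream
    }),
  );

  final json = jsonDecode(res.body) as Map<String, dynamic>;
  print(json['response']);

  // Ollama reports these counters (durations in nanoseconds) once done.
  final evalCount = json['eval_count'] as int;
  final evalDurationNs = json['eval_duration'] as int;
  final tokensPerSec = evalCount / (evalDurationNs / 1e9);
  print('$evalCount tokens at ${tokensPerSec.toStringAsFixed(1)} tok/s');
}
```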
- Flutter SDK (>=3.0.0)
- Dart SDK (>=3.0.0)
- A running Ollama server (local or remote)
- macOS 12.0 or later (for desktop version)
- iOS 15.0 or later (for mobile version)
- (Optional) Firebase project for cloud features
- Clone the repository:
  ```bash
  git clone https://github.com/callmeartan/devio.git
  cd devio
  ```
- Install dependencies:
  ```bash
  flutter pub get
  ```
- Configure your environment: create a `.env` file in the project root (a sketch of loading this value at runtime follows these steps):
  ```
  OLLAMA_HOST=localhost:11434 # Change to your Ollama server address
  ```
- Build and run:
  ```bash
  flutter run
  ```
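The repository's actual loading code isn't shown here; as a minimal sketch, assuming the `flutter_dotenv` package (not part of the stack listed below), the `OLLAMA_HOST` value could be read at startup like this:

```dart
import 'package:flutter_dotenv/flutter_dotenv.dart';

// Sketch only: load .env and read the server address,
// falling back to Ollama's default port when unset.
Future<String> loadOllamaHost() async {
  await dotenv.load(fileName: '.env');
  return dotenv.env['OLLAMA_HOST'] ?? 'localhost:11434';
}
```

Note that with `flutter_dotenv` the `.env` file must also be listed under `assets:` in `pubspec.yaml`.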
DevIO is designed to work seamlessly with locally hosted LLM servers. By default, it connects to Ollama running on localhost:11434.
- Install Ollama from ollama.ai
- Pull your preferred models:
  ```bash
  ollama pull llama3
  ollama pull deepseek-r1:8b
  ollama pull mistral:7b
  ollama pull phi3:14b
  ```
- Start the Ollama server with network access:
  ```bash
  OLLAMA_HOST=0.0.0.0:11434 ollama serve
  ```
To connect to a remote Ollama instance:
- Ensure the remote server is accessible
- Update the server address in the app settings
- Use the built-in connection test to verify connectivity
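As a rough sketch of what such a test can look like (the in-app check may differ), Ollama's `/api/tags` endpoint, which lists installed models, makes a convenient health probe:

```dart
import 'package:http/http.dart' as http;

// Illustrative connection probe; returns true if the server answers.
Future<bool> canReachOllama(String host) async {
  try {
    final res = await http
        .get(Uri.parse('http://$host/api/tags'))
        .timeout(const Duration(seconds: 5));
    return res.statusCode == 200;
  } catch (_) {
    return false; // refused, unreachable, or timed out
  }
}
```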
DevIO follows modern Flutter architecture patterns:
- Clean Architecture with separation of concerns
- BLoC/Cubit Pattern for state management
- Feature-first Structure organized by functionality
- Repository Pattern for data access abstraction
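To make the layering concrete, here is a simplified sketch of the repository + Cubit pattern; `LlmRepository` and `ChatCubit` are illustrative names rather than DevIO's actual classes:

```dart
import 'package:flutter_bloc/flutter_bloc.dart';

// The repository abstracts *how* completions are produced (Ollama,
// demo mode, ...), so the Cubit only handles state transitions.
abstract class LlmRepository {
  Future<String> complete(String prompt);
}

class ChatCubit extends Cubit<List<String>> {
  ChatCubit(this._repo) : super(const []);

  final LlmRepository _repo;

  Future<void> send(String prompt) async {
    emit([...state, 'you: $prompt']);
    final reply = await _repo.complete(prompt);
    emit([...state, 'llm: $reply']);
  }
}
```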
- Flutter 3.x for cross-platform UI development
- Firebase for authentication, storage, and cloud features
- `go_router` for navigation
- `flutter_bloc` for state management
- `freezed` for immutable state classes
- `http` for API communication with LLM servers
- `shared_preferences` for local settings storage
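As an example of how `freezed` and `flutter_bloc` fit together, a hypothetical connection-state union might look like the following (illustrative only; it requires the code-generation command shown with the build commands below):

```dart
// llm_state.dart — illustrative freezed union for connection state.
import 'package:freezed_annotation/freezed_annotation.dart';

part 'llm_state.freezed.dart';

@freezed
class LlmConnectionState with _$LlmConnectionState {
  const factory LlmConnectionState.disconnected() = _Disconnected;
  const factory LlmConnectionState.connecting() = _Connecting;
  const factory LlmConnectionState.connected({required String host}) =
      _Connected;
  const factory LlmConnectionState.failure(String message) = _Failure;
}
```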
```
lib/
├── blocs/          # Bloc state management
├── constants/      # App constants and configurations
├── cubits/         # Cubit state management
├── features/       # Feature modules
│   ├── llm/        # Core LLM functionality
│   └── settings/   # App configuration
├── models/         # Data models
├── providers/      # Provider implementations
├── repositories/   # Data repositories
├── screens/        # UI screens
├── services/       # Service implementations
├── theme/          # Theming and styling
├── utils/          # Utility functions
├── widgets/        # Reusable UI components
├── main.dart       # Application entry point
└── router.dart     # Navigation configuration
```
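To show how `main.dart` and `router.dart` relate, here is a stripped-down sketch using `go_router`; the routes and screens are placeholders, not the app's real ones:

```dart
import 'package:flutter/material.dart';
import 'package:go_router/go_router.dart';

// router.dart (sketch): declarative route table.
final router = GoRouter(
  routes: [
    GoRoute(path: '/', builder: (_, __) => const ChatScreen()),
    GoRoute(path: '/settings', builder: (_, __) => const SettingsScreen()),
  ],
);

// main.dart (sketch): wire the router into the app.
void main() => runApp(MaterialApp.router(routerConfig: router));

// Placeholder screens standing in for the real UI.
class ChatScreen extends StatelessWidget {
  const ChatScreen({super.key});
  @override
  Widget build(BuildContext context) => const Placeholder();
}

class SettingsScreen extends StatelessWidget {
  const SettingsScreen({super.key});
  @override
  Widget build(BuildContext context) => const Placeholder();
}
```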
Build release binaries for each platform:
```bash
# macOS
flutter build macos --release

# iOS
flutter build ios --release

# Android
flutter build apk --release
```
Run code generation (needed for the `freezed` classes) after changing annotated models:
```bash
flutter pub run build_runner build --delete-conflicting-outputs
```
- Windows and Linux support
- Audio input and output capabilities
- Advanced document analysis and summarization
- Plugin support for extending functionality
- Local model management interface
- Multi-user collaborative features
- Enhanced multimodal support
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- The Ollama team for making local LLMs accessible
- Open-source LLM communities
- Flutter team for the amazing cross-platform framework
- All contributors to this project