Releases: aingdesk/AingDesk
v1.1.7
feat: add and optimize several features
- Add support for parsing Mermaid standard syntax output format
- Introduce a configuration option for a custom Ollama service address to enhance flexibility (see the sketch after this list)
- Synchronize and maintain the latest Ollama model library list
fix: resolve issues
- Display error messages when model invocation fails, to aid troubleshooting
- Fix issue where the Ollama model storage directory configuration did not take effect immediately
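A custom Ollama endpoint is normally consumed by pointing the client at a configurable base URL instead of the default `http://127.0.0.1:11434`. A minimal sketch, assuming only the public Ollama REST API (`GET /api/tags` lists installed models); the `OLLAMA_BASE_URL` variable is illustrative, not an actual AingDesk setting:

```ts
// List models from a user-configured Ollama endpoint.
const OLLAMA_BASE_URL = process.env.OLLAMA_BASE_URL ?? "http://127.0.0.1:11434";

interface OllamaTag {
  name: string; // e.g. "deepseek-r1:7b"
  size: number; // bytes on disk
}

async function listModels(baseUrl: string = OLLAMA_BASE_URL): Promise<OllamaTag[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama endpoint ${baseUrl} returned ${res.status}`);
  }
  const body = (await res.json()) as { models: OllamaTag[] };
  return body.models;
}
```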
v1.1.6
- Fixed the issue where knowledge base document embedding could wait indefinitely in certain scenarios.
- Added a mechanism to detect the availability of embedding models (see the sketch after this list).
- Improved the PDF parser to enhance usability and better preserve formatting.
- Added support for selecting the model installation location.
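An embedding-model availability check can be as simple as sending a tiny probe to Ollama's embeddings endpoint and treating any error or empty vector as "unavailable". A minimal sketch (the endpoint and payload follow the public Ollama API; the model name is only an example, and this is not necessarily how AingDesk performs the check):

```ts
// Probe whether an embedding model actually responds before indexing documents.
async function embeddingModelAvailable(
  model: string,                        // e.g. "nomic-embed-text" (example name)
  baseUrl = "http://127.0.0.1:11434",
): Promise<boolean> {
  try {
    const res = await fetch(`${baseUrl}/api/embeddings`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, prompt: "ping" }), // tiny probe input
    });
    if (!res.ok) return false;
    const { embedding } = (await res.json()) as { embedding?: number[] };
    return Array.isArray(embedding) && embedding.length > 0;
  } catch {
    return false;                       // service down, model missing, etc.
  }
}
```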
v1.1.5
v1.1.4
chore: optimize certain features and fix related issues
- Automatically run an OCR model to extract text from images uploaded to non-vision models, then pass the extracted text to the large model as context (see the sketch after this version's notes)
- Fix issue where images failed to be passed to third-party models
- Fix failure in extracting DOC/PDF/MD documents caused by image extraction errors
fix: resolve document and image processing issues
- Ensure successful extraction of documents even if image extraction fails
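The general idea of the OCR fallback: when the selected model has no vision capability, run the uploaded image through an OCR engine and prepend the recognized text to the prompt. A minimal sketch using tesseract.js as a stand-in OCR engine; AingDesk's actual OCR model and wiring are not specified in these notes:

```ts
import Tesseract from "tesseract.js"; // stand-in OCR engine, not necessarily what AingDesk uses

// Build the prompt for a non-vision model: OCR the image and inject the text as context.
async function buildPromptWithOcr(userPrompt: string, imagePath: string): Promise<string> {
  const { data } = await Tesseract.recognize(imagePath, "eng+chi_sim"); // language packs are an assumption
  const extracted = data.text.trim();
  if (!extracted) return userPrompt;   // OCR found nothing; fall back to the raw prompt
  return `The user attached an image. OCR-extracted text:\n${extracted}\n\n${userPrompt}`;
}
```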
v1.1.3
feat: add new features and enhance version stability
- Add conversation attachment functionality
- Add option for conversation without memory
- Introduce an automatic context fine-tuning mechanism for Ollama (2048–4096, or 8192 for models ≤4b; see the sketch after this version's notes)
- Allow selection of third-party models and knowledge bases in sharing functionality
- Adjust knowledge base initialization interaction logic
fix: resolve multiple issues
- Fix Ollama installation failure for certain installation paths
- Fix issue where the close button does not fully exit the application on macOS
- Fix PDF parsing issue in knowledge base on macOS
- Update to the latest Ollama version and model list
- Resolve other known issues
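The context fine-tuning presumably maps to Ollama's `num_ctx` option on each request. The release note's rule is terse, so the thresholds below are an interpretation (8192 for models up to 4b parameters, otherwise 2048–4096); the request shape itself follows the public Ollama `/api/chat` API:

```ts
// Pick a context window (num_ctx) from the model's parameter count.
// The exact thresholds are an assumption based on the terse release note.
function pickNumCtx(paramsBillions: number, lowMemory: boolean): number {
  if (paramsBillions <= 4) return 8192; // small models can afford a larger context
  return lowMemory ? 2048 : 4096;       // larger models stay in the 2048-4096 range
}

async function chat(model: string, paramsBillions: number, messages: object[]) {
  const res = await fetch("http://127.0.0.1:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages,
      stream: false,
      options: { num_ctx: pickNumCtx(paramsBillions, false) },
    }),
  });
  return res.json();
}
```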
v1.1.2
v1.1.1
v1.0.7
- Adjusted the behavior of the close button to directly exit the program (reverting the previous change).
- Fixed the issue where some models installed via `ollama pull` could not be recognized, such as the `latest` version.
- Resolved the issue where the model list could only be retrieved after restarting the process when the Ollama service was stopped.
- Adjusted the automatic setting of the environment variable `OLLAMA_HOST=127.0.0.1` for Ollama installations via AingDesk to prevent potential malicious exploitation.
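Restricting the bundled Ollama service to loopback works by setting `OLLAMA_HOST` before the server starts, so it never listens on externally reachable interfaces. A minimal sketch of launching `ollama serve` that way from Node; the process management here is illustrative, not AingDesk's actual launcher:

```ts
import { spawn } from "node:child_process";

// Start the bundled Ollama server bound to loopback only.
// Binding to 127.0.0.1 keeps the API unreachable from other machines on the network.
function startOllama(ollamaBinary: string) {
  const child = spawn(ollamaBinary, ["serve"], {
    env: { ...process.env, OLLAMA_HOST: "127.0.0.1" },
    stdio: "inherit",
  });
  child.on("exit", (code) => console.log(`ollama exited with code ${code}`));
  return child;
}
```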
v1.0.6
- Optimized the dedicated prompts for deepseek-r1.
- Fixed the issue where the default backend language was English on macOS.
- Improved the Ollama download mechanism with automatic node switching and resumable downloads (see the sketch after this list).
- Adjusted the behavior of the close button so that clicking it minimizes the program to the system tray instead of exiting.
- Added a caching mechanism for web searches.
- Adjusted the regeneration strategy for user queries during web searches.
- Fixed the issue where the left and right spacing in the chat window was disproportionately large.
- Resolved the issue where chat scrolling during the generation phase did not behave as expected.
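Resumable downloads are typically built on HTTP Range requests: check how many bytes are already on disk and ask the server to continue from that offset. A minimal sketch in which "node switching" is reduced to trying a list of candidate mirror URLs in order; names and the overall flow are illustrative, not AingDesk's actual downloader:

```ts
import { createWriteStream, existsSync, statSync } from "node:fs";
import { Readable } from "node:stream";
import { pipeline } from "node:stream/promises";

// Resume a download from the first mirror that answers, continuing at the current file size.
async function resumableDownload(mirrors: string[], destPath: string): Promise<void> {
  const offset = existsSync(destPath) ? statSync(destPath).size : 0;
  for (const url of mirrors) {
    try {
      const res = await fetch(url, {
        headers: offset > 0 ? { Range: `bytes=${offset}-` } : {},
      });
      if (!res.ok || !res.body) continue;   // try the next mirror ("node switching")
      const append = res.status === 206;    // 206 Partial Content => server honored the Range header
      await pipeline(
        Readable.fromWeb(res.body as any),
        createWriteStream(destPath, { flags: append ? "a" : "w" }),
      );
      return;
    } catch {
      // network error: fall through to the next mirror
    }
  }
  throw new Error("all mirrors failed");
}
```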