
Make the default chat window memory size configurable #298

Closed
michaelchia opened this issue Jul 27, 2023 · 3 comments
Labels
enhancement New feature or request @jupyter-ai/chatui project:config Configuration mechanisms

Comments

@michaelchia
Collaborator

Problem

The chat conversation memory window of 2 is quite small, especially since people are used to much longer memories from their experience with ChatGPT.

Proposed Solution

  • Make the k param in the ConversationBufferWindowMemory of the DefaultChatHandler configurable.
  • Preferably as a setting in the UI.
  • But short-term fix of setting as some global constant like MEMORY_K = 2 which I am able to configure would be fine in the meantime.
  • Could be a provider-level configuration with a provider-level default, if differing context window sizes are an issue.
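For readers unfamiliar with windowed memory: the `k` parameter bounds how many past exchanges are replayed into each prompt. Below is a minimal, self-contained sketch of that behavior; `WindowMemory` is a hypothetical stand-in for illustration only, not the actual LangChain `ConversationBufferWindowMemory` class used by the `DefaultChatHandler`.

```python
from collections import deque


class WindowMemory:
    """Illustrative stand-in: keep only the last k human/AI exchanges."""

    def __init__(self, k=2):
        self.k = k
        # deque with maxlen silently discards the oldest exchange
        # once more than k have been added.
        self.exchanges = deque(maxlen=k)

    def add_exchange(self, human, ai):
        self.exchanges.append((human, ai))

    def buffer(self):
        # Flatten the retained exchanges into a prompt-ready transcript.
        lines = []
        for human, ai in self.exchanges:
            lines.append(f"Human: {human}")
            lines.append(f"AI: {ai}")
        return "\n".join(lines)


memory = WindowMemory(k=2)
for i in range(4):
    memory.add_exchange(f"question {i}", f"answer {i}")
# With k=2, only the last two exchanges survive in the transcript.
print(memory.buffer())
```

With `k=2`, everything before the last two exchanges is dropped from the prompt, which is exactly why longer conversations feel forgetful at that setting.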

Additional context

@michaelchia michaelchia added the enhancement New feature or request label Jul 27, 2023
@dlqqq
Member

dlqqq commented Jul 27, 2023

@michaelchia Yes, agreed. We set that limit earlier in 0.x because a lot of users were running into token limit issues. We're working on two different tasks in parallel:

  1. A more robust configuration system
  2. A better strategy for tracking token usage and limiting prompt size

I will make your issue a priority for when I implement the configuration system.

@JasonWeill JasonWeill added this to the 2.2.0 Release milestone Jul 27, 2023
@JasonWeill JasonWeill added the project:config Configuration mechanisms label Aug 28, 2023
@JasonWeill
Collaborator

Related to #218, a major config refactor.

@JasonWeill JasonWeill removed this from the 2.3.0 Release milestone Jan 25, 2024
@dlqqq
Member

dlqqq commented Feb 4, 2025

This was fixed long ago, but we forgot to close the issue. Users can now use --AiExtension.default_max_chat_history=... to set the number of historical messages passed to the chat model. 🎉
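For anyone landing here, the trait can be set on the command line or in a Jupyter config file; the value `10` below is just an example, not a recommended default:

```shell
# Set at launch time (example value):
jupyter lab --AiExtension.default_max_chat_history=10

# Or persist it in a config file, e.g. ~/.jupyter/jupyter_lab_config.py:
#   c.AiExtension.default_max_chat_history = 10
```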

@dlqqq dlqqq closed this as completed Feb 4, 2025