
server : (webui) introduce conversation branching + idb storage #11792

Merged: 5 commits into ggml-org:master on Feb 10, 2025

Conversation

@ngxson (Collaborator) commented on Feb 10, 2025

In this PR:

  • Add conversation "branching": the user can keep the old conversation history when editing or regenerating a message. This is implemented with a node-based system, the same approach used by ChatGPT and Hugging Chat.
  • Add IndexedDB as the main storage, powered by the Dexie library (a rough schema sketch follows this list).
  • Automatically migrate data from localStorage to IndexedDB the first time this version runs.
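
For readers unfamiliar with Dexie, a minimal setup along these lines could back such a storage layer. The database name, table names, fields, and indexes here are illustrative assumptions, not the PR's actual schema.

```ts
// Minimal Dexie sketch (illustrative only; the real webui schema may differ).
import Dexie, { Table } from 'dexie';

interface Conversation {
  id: string;          // primary key
  lastModified: number;
}

interface StoredMessage {
  id: string;          // primary key
  convId: string;      // conversation this node belongs to
  parent: string | null;
  content: string;
}

class ChatDB extends Dexie {
  conversations!: Table<Conversation, string>;
  messages!: Table<StoredMessage, string>;

  constructor() {
    super('chat-webui'); // assumed database name
    // Only the listed fields are indexed; all other fields are still stored.
    this.version(1).stores({
      conversations: 'id, lastModified',
      messages: 'id, convId, parent',
    });
  }
}

export const db = new ChatDB();
```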

Auto-migration

When this new version is launched, the migration is triggered automatically and copies all conversations from localStorage to IndexedDB. Please note that:

  • Old conversations are not deleted from localStorage.
  • A new item named migratedToIDB is stored in localStorage to mark that the migration has already been done, preventing a duplicate migration (see the sketch below).
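
A minimal sketch of that guard could look like this; migratedToIDB is the flag named above, while migrateIfNeeded and copyConversationsFromLocalStorage are hypothetical names used only for illustration.

```ts
// Hypothetical one-time migration guard; not the PR's actual code.
async function copyConversationsFromLocalStorage(): Promise<void> {
  // placeholder: read each conversation from localStorage and write it to IndexedDB
}

async function migrateIfNeeded(): Promise<void> {
  if (localStorage.getItem('migratedToIDB') !== null) {
    return; // migration already ran once, skip
  }
  await copyConversationsFromLocalStorage(); // copy only; originals stay in localStorage
  localStorage.setItem('migratedToIDB', '1'); // mark migration as done
}
```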

How it works

The content below is copied from types.ts

What is conversation "branching"? It is a feature that lets the user edit an old message in the history while still keeping the conversation flow.
It is inspired by ChatGPT / Claude / Hugging Chat: when you edit a message, a new branch of the conversation is created, and the old message remains visible.

We use the same node-based structure as other chat UIs, where each message has a parent and children. A "root" message is the first message in a conversation and is not displayed in the UI.
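
A node in this structure can be modeled roughly as follows; the field names are illustrative and may differ from the actual definitions in types.ts.

```ts
// Illustrative node shape (the real types.ts may differ).
interface MessageNode {
  id: string;
  convId: string;            // conversation this node belongs to
  type: 'root' | 'text';     // the root node carries no displayable content
  content: string;
  parent: string | null;     // null only for the root
  children: string[];        // ids of direct children (each child starts a possible branch)
}
```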

 root
  ├── message 1
  │      └── message 2
  │             └── message 3
  └── message 4
        └── message 5

In the above example, if the user wants to edit message 2, a new branch is created:

          ├── message 2
          │   └── message 3
          └── message 6

Messages 2 and 6 are siblings, and message 6 starts the new branch.
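
Conceptually, editing a message therefore appends a sibling under the same parent rather than overwriting the old node. The helper below is a hypothetical sketch (reusing the MessageNode shape from above), not the PR's API:

```ts
// Hypothetical sketch: editing a message creates a sibling node under the
// same parent, so the old message and its subtree stay intact.
function branchOnEdit(
  nodes: Map<string, MessageNode>,
  editedId: string,
  newContent: string
): MessageNode {
  const edited = nodes.get(editedId);
  if (!edited) throw new Error(`unknown node: ${editedId}`);
  const sibling: MessageNode = {
    id: crypto.randomUUID(),
    convId: edited.convId,
    type: 'text',
    content: newContent,
    parent: edited.parent,   // same parent as the edited message
    children: [],
  };
  nodes.set(sibling.id, sibling);
  if (edited.parent !== null) {
    nodes.get(edited.parent)?.children.push(sibling.id);
  }
  return sibling;
}
```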

We only need to know the last node (a.k.a. the leaf) to reconstruct the current branch. In the above example, message 5 is the leaf of the branch containing messages 4 and 5.

For the implementation:

  • StorageUtils.getMessages() returns the list of all nodes.
  • StorageUtils.filterByLeafNodeId() filters that list down to the branch ending at a given leaf node (a simplified sketch follows this list).
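
The branch reconstruction can be sketched as a walk from the leaf back to the root via parent pointers; this is a simplified stand-in for StorageUtils.filterByLeafNodeId, not its actual code.

```ts
// Simplified sketch: follow parent pointers from the leaf up to (but
// excluding) the root, then reverse so the oldest message comes first.
function filterByLeafNodeId(nodes: MessageNode[], leafId: string): MessageNode[] {
  const byId = new Map<string, MessageNode>(
    nodes.map((n): [string, MessageNode] => [n.id, n])
  );
  const branch: MessageNode[] = [];
  let cur = byId.get(leafId);
  while (cur && cur.type !== 'root') {
    branch.push(cur);
    cur = cur.parent !== null ? byId.get(cur.parent) : undefined;
  }
  return branch.reverse();
}
```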

@ngxson ngxson marked this pull request as ready for review February 10, 2025 14:26
@ngxson ngxson requested a review from ggerganov February 10, 2025 14:26
@ngxson ngxson merged commit 507f917 into ggml-org:master Feb 10, 2025
6 checks passed
tinglou pushed a commit to tinglou/llama.cpp that referenced this pull request Feb 13, 2025
…-org#11792)

* server : (webui) introduce conversation branching + idb storage

* mark old conv as "migrated" instead deleting them

* improve migration

* add more comments

* more clarification
orca-zhang pushed a commit to orca-zhang/llama.cpp that referenced this pull request Feb 26, 2025
arthw pushed a commit to arthw/llama.cpp that referenced this pull request Feb 26, 2025
ubergarm pushed a commit to ubergarm/llama.cpp that referenced this pull request Mar 1, 2025