Accessing all messages on onFinish hook for saving chat history [useChat] #2013
-
When using

```tsx
const {
  messages,
  input,
  setInput,
  handleInputChange,
  handleSubmit,
  isLoading: chatEndpointIsLoading,
} = useChat({
  api: endpoint,
  id: id,
  initialMessages: currentConversationMessages, // comes from indexedDB on page load
  onResponse(response) {},
  onFinish: async (message: Message) => {
    console.log("the AI message:", message);
    console.log("messages in onFinish", messages); // THIS IS EMPTY
  },
});
```

the `onFinish` callback only provides the streamed AI response message, but at what point can I save the user message in IndexedDB? I tried doing it in my `sendMessage` handler:

```tsx
// send message when the user clicks the Send button
async function sendMessage(e: FormEvent<HTMLFormElement>) {
  e.preventDefault();
  const name = input.substring(0, 30);

  // 1. update the conversation name
  const updatedConversations = conversations.map((c) =>
    c.id === id
      ? {
          ...c,
          name,
          messages: [
            ...(c.messages ?? []),
            {
              id: generateId(),
              role: "user",
              content: input,
              createdAt: new Date(),
            } as Message,
          ],
        }
      : c,
  );

  // dispatch to store to update sidebar name
  dispatch({
    field: "conversations",
    value: updatedConversations,
  });

  // 2. persist the conversation to indexedDB
  await persistConversations(updatedConversations);

  if (messageContainerRef.current) {
    messageContainerRef.current.classList.add("grow");
  }
  if (!messages.length) {
    await new Promise((resolve) => setTimeout(resolve, 300));
  }
  if (chatEndpointIsLoading) {
    return;
  }
  handleSubmit(e);
}
```

I also tried using a `useEffect` hook (roughly the sketch shown below). What is the recommended approach for persisting the FULL chat history for a conversation to somewhere like IndexedDB? I am building essentially a ChatGPT clone, so it has conversations in the sidebar, and you can click on one to see the full chat history.
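For reference, the kind of effect I tried is roughly this (a simplified sketch; `conversations`, `persistConversations`, and the conversation `id` come from my own app state and helpers, not from the SDK):

```tsx
import { useEffect } from "react";

// ...inside the chat component, alongside the useChat() call shown above:
useEffect(() => {
  if (!messages.length) return;

  // mirror the hook's messages into the current conversation and persist it
  const updatedConversations = conversations.map((c) =>
    c.id === id ? { ...c, messages } : c,
  );
  persistConversations(updatedConversations);

  // note: `messages` updates on every streamed chunk, so without gating on
  // isLoading (or debouncing) this writes to IndexedDB very frequently
}, [messages]);
```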
Replies: 3 comments 1 reply
-
@djam90 I'd also give the conversation a unique id. Note that I don't know offhand how to handle this for IndexedDB, but there should be a way to achieve this approach.
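A minimal sketch of what keying conversations by a unique id in IndexedDB could look like (illustrative only; this uses the `idb` helper package and `crypto.randomUUID()`, neither of which is mentioned in this thread, and the store/field names are made up):

```ts
import { openDB } from "idb";
import type { Message } from "ai";

interface Conversation {
  id: string;
  name: string;
  messages: Message[];
}

// open (or create) a database with an object store keyed by the conversation id
const dbPromise = openDB("chat-app", 1, {
  upgrade(db) {
    db.createObjectStore("conversations", { keyPath: "id" });
  },
});

export async function createConversation(name = "New chat"): Promise<Conversation> {
  const conversation: Conversation = {
    id: crypto.randomUUID(), // the generateId() used in the question would work too
    name,
    messages: [],
  };
  await (await dbPromise).put("conversations", conversation);
  return conversation;
}

export async function getConversation(id: string): Promise<Conversation | undefined> {
  return (await dbPromise).get("conversations", id);
}
```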
-
Here's the recommended way to store messages using `useChat`: in the route handler. Ideally, you save the current generation to the database on the server; that way you avoid redundant/duplicate saves on the client. I'm also assuming you're saving the whole conversation during every generation, so the following should work:

`@/app/api/chat/route.ts`

```ts
import { openai } from '@ai-sdk/openai';
import { streamText, convertToCoreMessages } from 'ai';

export async function POST(req: Request) {
  const { id, messages } = await req.json();

  const result = await streamText({
    model: openai('gpt-4-turbo'),
    messages: convertToCoreMessages(messages),
    async onFinish({ text, toolCalls, toolResults, usage, finishReason }) {
      // implement your own storage logic:
      saveChat({
        id,
        messages: [...messages, { role: 'assistant', content: text }],
      });
    },
  });

  return result.toAIStreamResponse();
}
```
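Note that `saveChat` is not an SDK function; it stands in for whatever storage you implement. A minimal sketch, assuming you simply overwrite one JSON file per conversation on the server (the function and path names are hypothetical):

```ts
import { promises as fs } from "fs";
import path from "path";

// hypothetical on-disk location for persisted chats
const CHAT_DIR = path.join(process.cwd(), ".chats");

interface StoredChat {
  id: string;
  messages: Array<{ role: string; content: string }>;
}

export async function saveChat({ id, messages }: StoredChat) {
  await fs.mkdir(CHAT_DIR, { recursive: true });
  // overwrite the whole conversation on every generation,
  // matching the "save the whole conversation" approach above
  await fs.writeFile(
    path.join(CHAT_DIR, `${id}.json`),
    JSON.stringify({ id, messages }, null, 2),
  );
}

export async function loadChat(id: string): Promise<StoredChat | null> {
  try {
    return JSON.parse(
      await fs.readFile(path.join(CHAT_DIR, `${id}.json`), "utf8"),
    );
  } catch {
    return null;
  }
}
```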
-
And what about storing the message from the user side, where `role: "user"`?