[X] respond with the user conversation turn over a turbo stream (see the
sketch after this list)
[X] also respond with an empty form that is disabled
[X] onGenerateText updates the conversation with the user prompt and assistant
response. Enable the form on a successful conversation update.
create conversation Turn component
Style edit page
Come up with something for new conversations
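A minimal sketch of the turbo stream response described above, assuming a
generate text requests controller; the partial names, stream targets, and
locals are assumptions, not the actual implementation.

```ruby
# app/controllers/generate_text_requests_controller.rb (sketch; partial and
# target names are assumptions)
class GenerateTextRequestsController < ApplicationController
  def create
    @conversation = Conversation.find(params[:conversation_id])
    # ... persist the prompt and enqueue the generation job ...

    render turbo_stream: [
      # append the user's conversation turn
      turbo_stream.append(
        "conversation_turns",
        partial: "conversations/turn",
        locals: { conversation: @conversation, prompt: params[:prompt] }
      ),
      # swap in an empty, disabled prompt form until the response arrives
      turbo_stream.replace(
        "prompt_form",
        partial: "conversations/prompt_form",
        locals: { conversation: @conversation, disabled: true }
      )
    ]
  end
end
```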
disable input on submit [3/3]
[X] disable the form on submit
[X] Move the onGenerateText event handler to the prompt form controller.
[X] On a successful conversation update, re-enable the form and the text input
add a spinner
Maybe this could be a placeholder component that is removed with the turbo
stream response
format content returned by the LLM
fix being unable to submit a prompt after a validation error
The form is likely still left disabled
delete conversation
for those conversations not linked to a memo
add copy button to assistant response
add conversation settings (e.g., temperature, system prompt, etc.)
preserve advanced options collapse state
add conversation title [7/7]
[X] Show the title at the top in a fixed container.
[X] Show form on click of edit icon (hide field) and put focus on input
[X] Hide form and show field when input loses focus
[X] Extract partial
[X] On update, render turbo stream partial
[X] add title to conversations table
[X] set title default based on first message
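A possible sketch of the default title behavior; `first_prompt` is a stand-in
for however the first user message is looked up and is an assumption.

```ruby
# app/models/conversation.rb (sketch; `first_prompt` is hypothetical)
class Conversation < ApplicationRecord
  before_validation :default_title, on: :create

  private

  # Fall back to a truncated copy of the first message when no title is given.
  def default_title
    self.title = first_prompt.to_s.truncate(60) if title.blank?
  end
end
```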
refactor [11/11]
[X] add response jsonb field to generate_text_requests
[X] update generate_text_request record with the full response.
This would be done in the GenerateTextJob
[X] when displaying the turns, do so from the generate text requests
associated to the conversation.
Conversation has a turns method that maps each message or response to a
Conversation::Turn object. Update this method to map each
generate_text_request, ordered by created_at, to its prompt and response.
Use the helper method for the prompt and the helper method on the
response.content. Wrap the JSON blob in an InvokeModelResponse object
(see the sketch after this list).
[X] refactor the concept of an exchange from the conversation jsonb field to
being constructed from the generate_text_requests.
[X] Migrate existing conversation exchange to the associated
generate_text_requests objects. Stub the token counts and what not.
[X] Remove code that updates the conversation from the conversations view.
[X] Can we remove the code that creates the conversation too? And ditch that
form object?
[X] Remove the code that updates the conversation from the memo feature.
[X] Consolidate memo conversation controller with conversations controller
[X] Drop the exchange column and remove the exchange attr
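A rough sketch of the rebuilt turns method, assuming Conversation::Turn takes
prompt/response keywords and that InvokeModelResponse exposes the formatted
content; both are assumptions based on the notes above.

```ruby
# app/models/conversation.rb (sketch; Turn's constructor and helper names
# are assumptions)
class Conversation < ApplicationRecord
  has_many :generate_text_requests, dependent: :destroy

  # Build the turns from the associated generate_text_requests instead of
  # the old exchange jsonb column.
  def turns
    generate_text_requests.order(:created_at).map do |gtr|
      response = InvokeModelResponse.new(gtr.response) # wrap the raw JSON blob
      Conversation::Turn.new(prompt: gtr.prompt, response: response.content)
    end
  end
end
```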
Remove Conversation::Turn classes [8/8]
[X] Use the gtrs in the conversation view. Refactor the
conversation_turn_component to get the data from gtr. It will render both
the user and assistant response
[X] Add the concept of pending_response to gtr. Use that to determine when
to show the spinner
[X] Implement a to_message_turn method on gtr that will return a tuple of
properly formatted user and assistant hashes that will be serialized for
the HTTP request
[X] For the exchange, loop over the gtrs and call to_message_turn to produce
the tuple of user and assistant responses
[X] Delete the Conversation::Turn objects
[X] Add enum to gtr (pending_response, complete, error)
[X] When the status is error, show an error message where the content would
have been
[X] When the generate text job errors, update the gtr to error and broadcast
the component. Do this in an exhausted-retries block (see the sketch after
this list)
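A sketch of the gtr status enum, to_message_turn, and the exhausted-retries
handling, assuming Rails 7 enum syntax and turbo-rails broadcasting; the
response keys, partial name, and retried exception are assumptions.

```ruby
# app/models/generate_text_request.rb (sketch)
class GenerateTextRequest < ApplicationRecord
  belongs_to :conversation

  enum :status, { pending_response: 0, complete: 1, error: 2 },
       default: :pending_response

  # Tuple of user/assistant hashes to be serialized into the HTTP request body.
  def to_message_turn
    [
      { role: "user", content: prompt },
      { role: "assistant", content: response&.dig("content") }
    ]
  end
end

# app/jobs/generate_text_job.rb (sketch)
class GenerateTextJob < ApplicationJob
  # When retries are exhausted, mark the request as errored and broadcast the
  # turn so the UI swaps the spinner for an error message.
  retry_on StandardError, attempts: 3 do |job, _error|
    gtr = job.arguments.first
    gtr.error!
    gtr.broadcast_replace_to gtr.conversation,
                             target: gtr,
                             partial: "generate_text_requests/turn",
                             locals: { generate_text_request: gtr }
  end

  def perform(generate_text_request)
    # ... call the model, store the response jsonb, mark as complete ...
  end
end
```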
move the delete button out of the form slot
redirect to edit view on first generate text request response with a disabled form [11/11]
[X] add accepts_nested_attributes_for generate_text_request to conversation
(see the sketch after this list)
[X] Change the prompt form component to the conversation form component.
Most of the fields will be for the generate text request
[X] Submit the form to the conversation post/put endpoints
[X] Redirect to conversation edit on create. Enqueue the GenerateTextJob
[X] Render the form and conversation turn components from the update action.
Enqueue the GenerateTextJob
[X] Remove rendering the turbo streams from the generate text requests controller
[X] Remove the hack that sets the browser history state
[X] Make sure the title is editable
[X] Use the show_options query param
[X] Make sure generating text still works for memos
[X] Fix bug where creating memo enqueues two GenerateTextJobs
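A sketch of the conversation-centric flow above; the permitted fields, the
show_options handling, and the template names are assumptions.

```ruby
# app/models/conversation.rb (sketch)
class Conversation < ApplicationRecord
  has_many :generate_text_requests, dependent: :destroy
  accepts_nested_attributes_for :generate_text_requests
end

# app/controllers/conversations_controller.rb (sketch)
class ConversationsController < ApplicationController
  def create
    @conversation = Conversation.new(conversation_params)
    if @conversation.save
      GenerateTextJob.perform_later(@conversation.generate_text_requests.last)
      # Land on the edit view; the form stays disabled until the response arrives.
      redirect_to edit_conversation_path(@conversation,
                                         show_options: params[:show_options])
    else
      render :new, status: :unprocessable_entity
    end
  end

  def update
    @conversation = Conversation.find(params[:id])
    if @conversation.update(conversation_params)
      GenerateTextJob.perform_later(@conversation.generate_text_requests.last)
      render :update # turbo streams for the form and the new conversation turn
    else
      render :edit, status: :unprocessable_entity
    end
  end

  private

  def conversation_params
    params.require(:conversation).permit(
      :title,
      generate_text_requests_attributes: %i[prompt model temperature preset_id]
    )
  end
end
```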
fix bug where button is still disabled after submitting on edit
turn metadata
Show model, preset, temp, token count
[X] Show info icon in the tray of the assistant response.
[X] On click, show the details
show token count for entire conversation
Add model, temp and preset to query params after create redirect
Update total token count on successful generate requests
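One way to compute the conversation-wide token count, assuming the stored
response jsonb carries a usage block with input/output token counts; the
field names are assumptions, and an incrementally updated counter column
would work as well.

```ruby
# app/models/conversation.rb (sketch; usage key names are assumptions)
class Conversation < ApplicationRecord
  def total_token_count
    generate_text_requests.complete.sum do |gtr|
      usage = gtr.response&.fetch("usage", {}) || {}
      usage.fetch("input_tokens", 0) + usage.fetch("output_tokens", 0)
    end
  end
end
```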
extract gentext generator to class
delete assistant response
extract conversation component
This is used for both the new and edit actions
move flash messages to their own stream
lib specs
request specs [2/2]
[X] finish conversation spec
[X] generate text requests
view component specs
view component browser specs
feature specs [4/4]
[X] fix memos
[X] create Conversation
[X] update conversation
[X] delete conversation
custom presets [10/10]
[X] Add preset_type enum to generate_text_presets table (default, custom)
[X] Add join table users_presets; it belongs to user and preset (see the
sketch after this list)
[X] Add route and CRUD actions
[X] Add the views
[X] Make and name your own presets
[X] Link to new preset from conversation
[X] Cache a reference to the conversation it was linked from so that creating
the preset redirects back to that conversation
[X] Make custom presets available in the preset drop down
[X] on redirect to the conversation, set the new preset as the selected option
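A sketch of the custom presets schema and models, following the enum values
and join table named above; the migration version, column types, and foreign
key details are assumptions.

```ruby
# db/migrate/xxxx_add_custom_presets.rb (sketch)
class AddCustomPresets < ActiveRecord::Migration[7.1]
  def change
    add_column :generate_text_presets, :preset_type, :integer,
               default: 0, null: false

    create_table :users_presets do |t|
      t.references :user, null: false, foreign_key: true
      t.references :preset, null: false,
                            foreign_key: { to_table: :generate_text_presets }
      t.timestamps
    end
  end
end

# app/models/generate_text_preset.rb (sketch)
class GenerateTextPreset < ApplicationRecord
  enum :preset_type, { default: 0, custom: 1 }
  has_many :users_presets, foreign_key: :preset_id, inverse_of: :preset
  has_many :users, through: :users_presets
end

# app/models/users_preset.rb (sketch)
class UsersPreset < ApplicationRecord
  belongs_to :user
  belongs_to :preset, class_name: "GenerateTextPreset"
end
```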