Problem
The app can only invoke extension handlers (registered by plugins), which is suitable for simple tasks such as inserting or updating data.
We need a mechanism that gives plugins control over the entire LLM streaming process and lets them update app state: a communication channel between the app and plugins.
Cases: move the streaming session and download-progress retrieval into plugins.
Solution
The event emitter mechanism helps address more complex tasks, such as managing the entire LLM stream or handling multiple bots replying.
This also enables building use-case plugins. With this capability, plugins can control the entire LLM inference process, from receiving a message to updating it. For example, when the app sends a message, a plugin can build a chain of agents to process it and gradually update the message.
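For concreteness, below is a minimal sketch of the event bridge this proposal assumes. The `events` object and the `EventName` members are illustrative (the `OnMessageResponseUpdate` event for incremental updates is a hypothetical addition, not confirmed by the current API); the real module would live in the app core and be importable from plugins.

```ts
// Minimal sketch of the app <-> plugin event bridge (names are illustrative).
type Handler<T> = (data: T) => void

export enum EventName {
  OnNewMessageRequest = 'onNewMessageRequest',
  OnNewMessageResponse = 'onNewMessageResponse',
  // Hypothetical event for incremental updates while a reply streams in.
  OnMessageResponseUpdate = 'onMessageResponseUpdate',
}

const listeners = new Map<EventName, Handler<any>[]>()

export const events = {
  // Plugins subscribe to app events (e.g. a new message request).
  on<T>(event: EventName, handler: Handler<T>) {
    listeners.set(event, [...(listeners.get(event) ?? []), handler])
  },
  // Either side emits; all handlers registered for the event are invoked.
  emit<T>(event: EventName, data: T) {
    for (const handler of listeners.get(event) ?? []) handler(data)
  },
}
```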
The app now wraps the entire streaming process.
Plugins should be able to control the LLM agent and sync state (see the sketch below):
Use-case-based plugins:
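As a sketch of that control flow, the plugin below drives a streaming completion itself and syncs partial state back to the app. `streamCompletion` is a hypothetical stand-in for whatever LLM client the plugin uses, `data.message` assumes the request carries the prompt text, and `OnMessageResponseUpdate` is the assumed incremental-update event from the sketch above.

```ts
// Hypothetical LLM client: yields tokens as they arrive.
declare function streamCompletion(prompt: string): AsyncIterable<string>

async function handleMessageRequest(data: NewMessageRequest) {
  let content = ''
  for await (const token of streamCompletion(data.message)) {
    content += token
    // Sync partial state so the app can render the reply as it streams.
    events.emit(EventName.OnMessageResponseUpdate, { ...data, message: content })
  }
  // Final state: the completed message.
  events.emit(EventName.OnNewMessageResponse, { ...data, message: content })
}

events.on(EventName.OnNewMessageRequest, handleMessageRequest)
```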
You can subscribe to `NewMessageRequest` events by defining a function to handle the event and registering it with the `events` object:
```ts
function handleMessageRequest(message: NewMessageRequest) {
  // Your logic here. For example:
  // const response = openai.createChatCompletion({...})
}

function registerListener() {
  events.on(EventName.OnNewMessageRequest, handleMessageRequest);
}

// Register the listener function with the relevant extension points.
export function init({ register }) {
  registerListener();
}
```
In this example, we define a function called `handleMessageRequest` that takes a `NewMessageRequest` object as its argument. We also define a `registerListener` function that registers `handleMessageRequest` as a listener for `NewMessageRequest` events using the `on` method of the `events` object.
```ts
// openai-node v3 client, inferred from the snippet's createChatCompletion usage.
import { Configuration, OpenAIApi } from 'openai'

const openai = new OpenAIApi(new Configuration({ apiKey: process.env.OPENAI_API_KEY }))

async function handleMessageRequest(data: NewMessageRequest) {
  // Your logic here. For example:
  const response = await openai.createChatCompletion({...})
  const message: NewMessageResponse = {
    ...data,
    message: response.data.choices[0].message.content
  }
  // Now emit an event so the app can display it in the conversation
  events.emit(EventName.OnNewMessageResponse, message)
}
```
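To complete the round trip, the app registers its own listener for the response event. The `conversationStore` shown here is a hypothetical stand-in for however the app persists and renders messages.

```ts
// App side (sketch): render plugin-emitted replies in the conversation.
events.on(EventName.OnNewMessageResponse, (message: NewMessageResponse) => {
  conversationStore.appendMessage(message) // hypothetical store API
})
```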