Problem
The Observability AI Assistant currently has a setup step that creates an inference endpoint, obs_ai_assistant_kb_inference, which downloads the .elser_model_2 model when created. This endpoint is required for the knowledge base and adds complexity to the codebase.
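The setup step boils down to a single call to the Elasticsearch Inference API. The sketch below shows the shape of that request; the allocation settings are illustrative placeholders, not the exact values the plugin uses:

```typescript
// Sketch of the request the setup step issues to create the custom
// inference endpoint. The path and body follow the Elasticsearch
// Inference API (PUT _inference/<task_type>/<endpoint_id>).
const endpointId = "obs_ai_assistant_kb_inference";

const request = {
  method: "PUT",
  path: `/_inference/sparse_embedding/${endpointId}`,
  body: {
    service: "elser", // triggers the .elser_model_2 download on first use
    service_settings: {
      num_allocations: 1, // illustrative values; the plugin's may differ
      num_threads: 1,
    },
  },
};

console.log(JSON.stringify(request, null, 2));
```

It is this extra creation (and model download) step, plus the UI around it, that the preconfigured endpoints would let us drop.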
Solution
8.17 ships with preconfigured inference endpoints for the ELSER and E5 models (elastic/elasticsearch#116931). We should replace our own endpoint, obs_ai_assistant_kb_inference, with the preconfigured .elser-2-elasticsearch endpoint.
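With the preconfigured endpoint, the knowledge-base component template can reference .elser-2-elasticsearch directly via its inference_id, with no endpoint creation at setup time. A minimal sketch of what the mapping might look like (the field name here is illustrative, not necessarily the plugin's actual mapping):

```typescript
// Sketch: a component template mapping whose semantic_text field points at
// the preconfigured endpoint instead of a custom one.
const kbComponentTemplate = {
  template: {
    mappings: {
      properties: {
        semantic_text: {
          type: "semantic_text",
          inference_id: ".elser-2-elasticsearch", // preconfigured in 8.17+
        },
      },
    },
  },
};

console.log(JSON.stringify(kbComponentTemplate, null, 2));
```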
sorenlouv changed the title from "[Obs AI Assistant] Use built-in inference endpoint" to "[Obs AI Assistant] Use preconfigured elser inference endpoint" on Nov 21, 2024.
This has been postponed because the preconfigured endpoints don't allow us to adjust min_number_of_allocations. As soon as that becomes possible, or we have a multi-tenant inference service, we can pick this up again. In the meantime we'll continue using custom inference endpoints.
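For context, this is the knob a custom endpoint gives us: the Inference API exposes min_number_of_allocations through adaptive allocation settings, which the preconfigured endpoints don't let us override. A sketch of such an endpoint body, with illustrative values:

```typescript
// Sketch of a custom inference endpoint body that pins
// min_number_of_allocations via adaptive allocations -- the setting the
// preconfigured endpoints currently don't expose. Values are illustrative.
const customEndpointBody = {
  service: "elser",
  service_settings: {
    adaptive_allocations: {
      enabled: true,
      min_number_of_allocations: 0, // allow scaling down to zero when idle
      max_number_of_allocations: 2,
    },
    num_threads: 1,
  },
};

console.log(JSON.stringify(customEndpointBody, null, 2));
```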
Relevant code: x-pack/plugins/observability_solution/observability_ai_assistant/server/service/kb_component_template.ts, line 64 (commit f6ac2cf).
Current setup UI
This screenshot shows the setup UI that we could remove by using the preconfigured inference endpoint.