fix(js): Make llm.system and llm.provider available from LLMAttributePostfixes (#1135)
cephalization authored Nov 25, 2024
1 parent 09e702f commit 710d1d3
Showing 2 changed files with 10 additions and 2 deletions.
js/.changeset/honest-chefs-hang.md (5 additions, 0 deletions)

@@ -0,0 +1,5 @@
+---
+"@arizeai/openinference-semantic-conventions": minor
+---
+
+Add llm.system and llm.provider to LLMAttributePostfixes record
Second changed file (5 additions, 2 deletions):

@@ -24,6 +24,8 @@ export const SemanticAttributePrefixes = {
 } as const;
 
 export const LLMAttributePostfixes = {
+  provider: "provider",
+  system: "system",
   model_name: "model_name",
   token_count: "token_count",
   input_messages: "input_messages",
@@ -165,12 +167,13 @@ export const LLM_MODEL_NAME =
  * The provider of the inferences. E.g. the cloud provider
  */
 export const LLM_PROVIDER =
-  `${SemanticAttributePrefixes.llm}.provider` as const;
+  `${SemanticAttributePrefixes.llm}.${LLMAttributePostfixes.provider}` as const;
 
 /**
  * The AI product as identified by the client or server
  */
-export const LLM_SYSTEM = `${SemanticAttributePrefixes.llm}.system` as const;
+export const LLM_SYSTEM =
+  `${SemanticAttributePrefixes.llm}.${LLMAttributePostfixes.system}` as const;
 
 /** Token count for the completion by the llm */
 export const LLM_TOKEN_COUNT_COMPLETION =
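The change replaces hard-coded string suffixes with lookups into the LLMAttributePostfixes record, so both halves of each attribute key come from a single source of truth and TypeScript still infers the exact literal key. A minimal standalone sketch of the pattern (the object literals below are abbreviated stand-ins for illustration, not the package's full definitions):

```typescript
// Abbreviated stand-ins for the package's prefix/postfix records.
const SemanticAttributePrefixes = {
  llm: "llm",
} as const;

const LLMAttributePostfixes = {
  provider: "provider",
  system: "system",
} as const;

// Keys are composed from the two records; with `as const`, TypeScript
// infers the literal types "llm.provider" and "llm.system" rather than
// the widened type `string`.
const LLM_PROVIDER =
  `${SemanticAttributePrefixes.llm}.${LLMAttributePostfixes.provider}` as const;
const LLM_SYSTEM =
  `${SemanticAttributePrefixes.llm}.${LLMAttributePostfixes.system}` as const;

console.log(LLM_PROVIDER); // "llm.provider"
console.log(LLM_SYSTEM); // "llm.system"
```

Because the postfixes now exist as record entries, consumers can also enumerate or look up the suffixes themselves instead of re-deriving them from the composed constants.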
