
Commit

include message metadata in TestLLMWithStreaming
dlqqq committed Sep 23, 2024
1 parent 8b14533 commit febb8ef
Showing 1 changed file with 4 additions and 3 deletions.
packages/jupyter-ai-test/jupyter_ai_test/test_llms.py: 7 changes (4 additions, 3 deletions)
@@ -48,10 +48,11 @@ def _stream(
         run_manager: Optional[CallbackManagerForLLMRun] = None,
         **kwargs: Any,
     ) -> Iterator[GenerationChunk]:
-        time.sleep(5)
+        time.sleep(1)
         yield GenerationChunk(
-            text="Hello! This is a dummy response from a test LLM. I will now count from 1 to 100.\n\n"
+            text="Hello! This is a dummy response from a test LLM. I will now count from 1 to 20.\n\n",
+            generation_info={"test_metadata_field":"foobar"}
         )
-        for i in range(1, 101):
+        for i in range(1, 21):
             time.sleep(0.5)
             yield GenerationChunk(text=f"{i}, ")
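The patched `_stream` now attaches `generation_info` metadata to the first chunk and yields 20 counting chunks instead of 100. A minimal self-contained sketch of that streaming behavior is below; it uses a stand-in `GenerationChunk` dataclass rather than the real `langchain_core.outputs.GenerationChunk`, and omits the `time.sleep` calls so it runs instantly:

```python
from dataclasses import dataclass
from typing import Any, Dict, Iterator, Optional

# Stand-in for langchain's GenerationChunk (assumption: only the
# `text` and `generation_info` fields matter for this test).
@dataclass
class GenerationChunk:
    text: str
    generation_info: Optional[Dict[str, Any]] = None

def stream() -> Iterator[GenerationChunk]:
    """Mimics the patched _stream: one greeting chunk carrying
    metadata, followed by the numbers 1..20 as separate chunks."""
    yield GenerationChunk(
        text=(
            "Hello! This is a dummy response from a test LLM. "
            "I will now count from 1 to 20.\n\n"
        ),
        generation_info={"test_metadata_field": "foobar"},
    )
    for i in range(1, 21):
        yield GenerationChunk(text=f"{i}, ")

chunks = list(stream())
# 1 greeting chunk + 20 counting chunks = 21 chunks in total;
# only the first chunk carries generation_info.
```

Attaching metadata only to the first chunk mirrors the commit's intent: a consumer of the stream can test that message metadata propagates without inspecting every chunk.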
