[Bug]: Asking for OPENAI_API_KEY for non-OpenAI LLMs #1414
Comments
Hey @dhruv-anand-aintech, do you see this when setting the Perplexity API key?

```python
import os
from litellm import completion

# Set the Perplexity key via the environment variable litellm expects.
os.environ["PERPLEXITYAI_API_KEY"] = "..."

messages = [{"role": "user", "content": "Hello, how are you?"}]

response = completion(
    model="perplexity/mistral-7b-instruct",
    messages=messages,
)
print(response)
```
Not able to repro this on my end.
Hmmm, looks like my load_dotenv call hasn't worked in the notebook I'm using. It would still be useful for the error to say that PERPLEXITYAI_API_KEY is not set. Thanks for debugging!
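For reference, a minimal sketch of loading the key from a .env file in a notebook. This assumes the python-dotenv package and a .env file containing PERPLEXITYAI_API_KEY; the explicit check at the end is just an illustration, not part of litellm:

```python
import os
from dotenv import load_dotenv

# Load variables from a .env file. In a notebook the working directory
# may not be the project root, so the file can silently fail to load;
# load_dotenv() returns False in that case.
loaded = load_dotenv()
print("loaded .env:", loaded)

# Fail early with a clear message instead of the OpenAI client's
# OPENAI_API_KEY error surfacing later.
if not os.getenv("PERPLEXITYAI_API_KEY"):
    raise RuntimeError("PERPLEXITYAI_API_KEY not set")
```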
Great - closing this issue then.
This is showing up again: https://gist.github.com/dhruv-anand-aintech/4d1a8fd4a27062de41e5449e91880d66
Opened a PR for this here, @dhruv-anand-aintech: #1765
One more PR: #1776
What happened?
Tried using Perplexity's LLM and got this error:
'The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable'
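A minimal sketch of the failing call, assuming litellm's completion API and that no Perplexity key is exported in the environment (the model name is the one used in the comments above):

```python
from litellm import completion

# With PERPLEXITYAI_API_KEY unset, the request fails. Perplexity is served
# through an OpenAI-compatible client, which is presumably why the error
# message complains about OPENAI_API_KEY rather than pointing at the
# missing PERPLEXITYAI_API_KEY.
response = completion(
    model="perplexity/mistral-7b-instruct",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response)
```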
Relevant log output
Twitter / LinkedIn details
https://twitter.com/dhruv___anand