
[Bug]: Asking for OPENAI_API_KEY for non-OpenAI LLMs #1414

Closed
dhruv-anand-aintech opened this issue Jan 11, 2024 · 7 comments
Labels: bug (Something isn't working)

@dhruv-anand-aintech (Contributor)

What happened?

Tried using Perplexity's LLM:

import litellm

messages = [{"content": "What's the latest news about Israel-Palestine war?", "role": "user"}]
litellm.completion(model="perplexity/pplx-7b-online", messages=messages)

Got this error:
'The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable'

Relevant log output

{
	"name": "APIError",
	"message": "OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable",
	"stack": "---------------------------------------------------------------------------
OpenAIError                               Traceback (most recent call last)
File ~/miniforge3/lib/python3.10/site-packages/litellm/llms/openai.py:344, in OpenAIChatCompletion.completion(self, model_response, timeout, model, messages, print_verbose, api_key, api_base, acompletion, logging_obj, optional_params, litellm_params, logger_fn, headers, custom_prompt_dict, client)
    343             else:
--> 344                 raise e
    345 except OpenAIError as e:

File ~/miniforge3/lib/python3.10/site-packages/litellm/llms/openai.py:288, in OpenAIChatCompletion.completion(self, model_response, timeout, model, messages, print_verbose, api_key, api_base, acompletion, logging_obj, optional_params, litellm_params, logger_fn, headers, custom_prompt_dict, client)
    287 if client is None:
--> 288     openai_client = OpenAI(
    289         api_key=api_key,
    290         base_url=api_base,
    291         http_client=litellm.client_session,
    292         timeout=timeout,
    293         max_retries=max_retries,
    294     )
    295 else:

File ~/miniforge3/lib/python3.10/site-packages/openai/_client.py:99, in OpenAI.__init__(self, api_key, organization, base_url, timeout, max_retries, default_headers, default_query, http_client, _strict_response_validation)
     98 if api_key is None:
---> 99     raise OpenAIError(
    100         \"The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable\"
    101     )
    102 self.api_key = api_key

OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

During handling of the above exception, another exception occurred:

OpenAIError                               Traceback (most recent call last)
File ~/miniforge3/lib/python3.10/site-packages/litellm/main.py:748, in completion(model, messages, timeout, temperature, top_p, n, stream, stop, max_tokens, presence_penalty, frequency_penalty, logit_bias, user, response_format, seed, tools, tool_choice, logprobs, top_logprobs, deployment_id, functions, function_call, base_url, api_version, api_key, model_list, **kwargs)
    742     logging.post_call(
    743         input=messages,
    744         api_key=api_key,
    745         original_response=str(e),
    746         additional_args={\"headers\": headers},
    747     )
--> 748     raise e
    750 if optional_params.get(\"stream\", False):
    751     ## LOGGING

File ~/miniforge3/lib/python3.10/site-packages/litellm/main.py:723, in completion(model, messages, timeout, temperature, top_p, n, stream, stop, max_tokens, presence_penalty, frequency_penalty, logit_bias, user, response_format, seed, tools, tool_choice, logprobs, top_logprobs, deployment_id, functions, function_call, base_url, api_version, api_key, model_list, **kwargs)
    722 try:
--> 723     response = openai_chat_completions.completion(
    724         model=model,
    725         messages=messages,
    726         headers=headers,
    727         model_response=model_response,
    728         print_verbose=print_verbose,
    729         api_key=api_key,
    730         api_base=api_base,
    731         acompletion=acompletion,
    732         logging_obj=logging,
    733         optional_params=optional_params,
    734         litellm_params=litellm_params,
    735         logger_fn=logger_fn,
    736         timeout=timeout,
    737         custom_prompt_dict=custom_prompt_dict,
    738         client=client,  # pass AsyncOpenAI, OpenAI client
    739     )
    740 except Exception as e:
    741     ## LOGGING - log the original exception returned

File ~/miniforge3/lib/python3.10/site-packages/litellm/llms/openai.py:352, in OpenAIChatCompletion.completion(self, model_response, timeout, model, messages, print_verbose, api_key, api_base, acompletion, logging_obj, optional_params, litellm_params, logger_fn, headers, custom_prompt_dict, client)
    351 else:
--> 352     raise OpenAIError(status_code=500, message=str(e))

OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

During handling of the above exception, another exception occurred:

APIError                                  Traceback (most recent call last)
Cell In[11], line 2
      1 messages = [{ \"content\": \"What's the latest news about Israel-Palestine war?\",\"role\": \"user\"}]
----> 2 litellm.completion(model=\"perplexity/pplx-7b-online\", messages=messages)

File ~/miniforge3/lib/python3.10/site-packages/litellm/utils.py:2130, in client.<locals>.wrapper(*args, **kwargs)
   2126         if (
   2127             liteDebuggerClient and liteDebuggerClient.dashboard_url != None
   2128         ):  # make it easy to get to the debugger logs if you've initialized it
   2129             e.message += f\"\\n Check the log in your dashboard - {liteDebuggerClient.dashboard_url}\"
-> 2130 raise e

File ~/miniforge3/lib/python3.10/site-packages/litellm/utils.py:2037, in client.<locals>.wrapper(*args, **kwargs)
   2035                     return cached_result
   2036 # MODEL CALL
-> 2037 result = original_function(*args, **kwargs)
   2038 end_time = datetime.datetime.now()
   2039 if \"stream\" in kwargs and kwargs[\"stream\"] == True:
   2040     # TODO: Add to cache for streaming

File ~/miniforge3/lib/python3.10/site-packages/litellm/main.py:1746, in completion(model, messages, timeout, temperature, top_p, n, stream, stop, max_tokens, presence_penalty, frequency_penalty, logit_bias, user, response_format, seed, tools, tool_choice, logprobs, top_logprobs, deployment_id, functions, function_call, base_url, api_version, api_key, model_list, **kwargs)
   1743     return response
   1744 except Exception as e:
   1745     ## Map to OpenAI Exception
-> 1746     raise exception_type(
   1747         model=model,
   1748         custom_llm_provider=custom_llm_provider,
   1749         original_exception=e,
   1750         completion_kwargs=args,
   1751     )

File ~/miniforge3/lib/python3.10/site-packages/litellm/utils.py:6628, in exception_type(model, original_exception, custom_llm_provider, completion_kwargs)
   6626 # don't let an error with mapping interrupt the user from receiving an error from the llm api calls
   6627 if exception_mapping_worked:
-> 6628     raise e
   6629 else:
   6630     raise original_exception

File ~/miniforge3/lib/python3.10/site-packages/litellm/utils.py:5632, in exception_type(model, original_exception, custom_llm_provider, completion_kwargs)
   5630     else:
   5631         exception_mapping_worked = True
-> 5632         raise APIError(
   5633             status_code=original_exception.status_code,
   5634             message=f\"OpenAIException - {original_exception.message}\",
   5635             llm_provider=\"openai\",
   5636             model=model,
   5637             request=original_exception.request,
   5638         )
   5639 else:
   5640     # if no status code then it is an APIConnectionError: https://github.com/openai/openai-python#handling-errors
   5641     raise APIConnectionError(
   5642         __cause__=original_exception.__cause__,
   5643         llm_provider=custom_llm_provider,
   5644         model=model,
   5645         request=original_exception.request,
   5646     )

APIError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable"
}
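
For context: Perplexity exposes an OpenAI-compatible endpoint, so litellm constructs a standard openai client under the hood (visible in the traceback above), which is why an OPENAI_API_KEY error surfaces for a non-OpenAI provider. A minimal sketch of the equivalent direct call, assuming the openai v1 Python client and Perplexity's documented base URL:

import os
from openai import OpenAI

# litellm builds a client much like this internally; the key must come from
# PERPLEXITYAI_API_KEY (or be passed explicitly), not OPENAI_API_KEY
client = OpenAI(
    api_key=os.environ["PERPLEXITYAI_API_KEY"],
    base_url="https://api.perplexity.ai",
)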

Twitter / LinkedIn details

https://twitter.com/dhruv___anand

@dhruv-anand-aintech dhruv-anand-aintech added the bug Something isn't working label Jan 11, 2024
@krrishdholakia (Contributor)

Hey @dhruv-anand-aintech, do you see this when setting the Perplexity API key?

import os
from litellm import completion

os.environ["PERPLEXITYAI_API_KEY"] = "..."

# messages as defined in the original report
response = completion(
    model="perplexity/mistral-7b-instruct",
    messages=messages
)
print(response)
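
Alternatively (per the completion signature shown in the traceback), the key can be passed directly instead of via the environment:

response = completion(
    model="perplexity/pplx-7b-online",
    messages=messages,
    api_key="pplx-...",  # your Perplexity key, passed explicitly
)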

@krrishdholakia (Contributor)

Not able to repro this on my end.

@krrishdholakia krrishdholakia self-assigned this Jan 11, 2024
@dhruv-anand-aintech (Contributor, Author)

Hmmm, looks like my load_dotenv hasn't worked in the notebook I'm using.
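
A quick way to check: python-dotenv's load_dotenv returns False when no .env file was found, and notebooks often run from a different working directory than the project root, e.g.:

import os
from dotenv import load_dotenv

# pass an explicit path if the notebook's cwd differs from the project root
loaded = load_dotenv()  # e.g. load_dotenv("/path/to/project/.env")
print(loaded, "PERPLEXITYAI_API_KEY" in os.environ)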

It'd still be useful for the error to say PERPLEXITYAI_API_KEY is not set. Something like the sketch below could do it.
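
(A hypothetical sketch, not litellm's actual code; the provider-to-env-var mapping is illustrative and litellm's real table is much larger.)

import os

PROVIDER_KEY_VARS = {
    "perplexity": "PERPLEXITYAI_API_KEY",
    "openai": "OPENAI_API_KEY",
}

def check_provider_key(model: str) -> None:
    # litellm namespaces models as "<provider>/<model>"; bare names default to openai
    provider = model.split("/", 1)[0] if "/" in model else "openai"
    env_var = PROVIDER_KEY_VARS.get(provider, "OPENAI_API_KEY")
    if not os.environ.get(env_var):
        raise ValueError(f"{env_var} not set for model '{model}'")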

Thanks for debugging!

@krrishdholakia (Contributor)

Great - closing this issue then.

@ishaan-jaff ishaan-jaff reopened this Jan 31, 2024
@ishaan-jaff (Contributor)

Opened a PR here for this, @dhruv-anand-aintech: #1765

@ishaan-jaff (Contributor)

One more PR #1776
