```
Provider List: https://docs.litellm.ai/docs/providers

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new

LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.

Traceback (most recent call last):
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\site-packages\litellm\main.py", line 260, in completion
    model, custom_llm_provider, dynamic_api_key = get_llm_provider(model=model, custom_llm_provider=custom_llm_provider, api_base=api_base)
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\site-packages\litellm\utils.py", line 1485, in get_llm_provider
    raise e
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\site-packages\litellm\utils.py", line 1482, in get_llm_provider
    raise ValueError(f"LLM Provider NOT provided. Pass in the LLM provider you are trying to call. E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/{model}',..)` Learn more: https://docs.litellm.ai/docs/providers")
ValueError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/ft:gpt-3.5-turbo-0613:xxx::xxx',..)` Learn more: https://docs.litellm.ai/docs/providers

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "E:\project_hub\babel\babel_gpt35_finetune\babel_finetune\alpha\demo3.py", line 13, in <module>
    response = completion(
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\site-packages\litellm\utils.py", line 791, in wrapper
    raise e
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\site-packages\litellm\utils.py", line 750, in wrapper
    result = original_function(*args, **kwargs)
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\site-packages\litellm\timeout.py", line 53, in wrapper
    result = future.result(timeout=local_timeout_duration)
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\concurrent\futures\_base.py", line 458, in result
    return self.__get_result()
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\concurrent\futures\_base.py", line 403, in __get_result
    raise self._exception
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\site-packages\litellm\timeout.py", line 42, in async_func
    return func(*args, **kwargs)
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\site-packages\litellm\main.py", line 1188, in completion
    raise exception_type(
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\site-packages\litellm\utils.py", line 3000, in exception_type
    raise e
  File "E:\Programming\anaconda\envs\babel_gpt35_finetune\lib\site-packages\litellm\utils.py", line 2982, in exception_type
    raise APIError(status_code=500, message=str(original_exception), llm_provider=custom_llm_provider, model=model)
litellm.exceptions.APIError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. E.g. For 'Huggingface' inference endpoints pass in `completion(model='huggingface/ft:gpt-3.5-turbo-0613:xxx::xxx',..)` Learn more: https://docs.litellm.ai/docs/providers
```
Undertone0809 changed the title from "[Bug]: recognizes openai finetune model as huggingface model when calling" to "[Bug]: recognizing openai finetune model as huggingface model when calling" on Oct 16, 2023.
I solved this problem by adding the parameter `custom_llm_provider="openai"`, but I don't know the underlying reason why this provider must be passed. It seems that for an OpenAI fine-tuned model, `get_llm_provider()` cannot infer an `llm_provider` from the model name alone.
### What happened?

A bug happened!

I tried to use an OpenAI fine-tuned model but the call failed: litellm recognizes the fine-tuned model as a Hugging Face model.

docs: https://docs.litellm.ai/docs/tutorials/finetuned_chat_gpt
### Relevant log output