
WebLLM cannot load new models #6678

Closed
5 tasks done
alexander-fischer opened this issue Sep 2, 2024 · 1 comment · Fixed by #6696
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@alexander-fischer

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import * as webllm from "@mlc-ai/web-llm"

const MODEL = "gemma-2-2b-it-q4f16_1-MLC"

private async loadLLM() {
  const initProgressCallback = (report: webllm.InitProgressReport) => {
    console.log(report.text)
  }

  const createdEngine = await webllm.CreateMLCEngine(MODEL, {
    initProgressCallback,
  })
  return createdEngine
}

Error Message and Stack Trace (if applicable)

Uncaught (in promise) Error: Failed to load model: No URL found for model ID "gemma-2-2b-it-q4f16_1-MLC". Please check if the model ID is correct and included in the model_list configuration.
at findModelRecord (index.js:7479:23)
at MLCEngine.eval (index.js:7481:33)
at Generator.next ()
at fulfilled (index.js:1279:56)

Description

I'm using LangChain.js together with @mlc-ai/web-llm. Newer models cannot be loaded because the @langchain/community package pins an old version of web-llm as a peer dependency.

I would recommend updating the peerDependency to 0.2.62.
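Until the peer dependency is bumped upstream, a consumer can force the newer version from their own package.json. A minimal sketch using npm's overrides field (assumes npm ≥ 8.3; yarn users would use "resolutions" instead):

```json
{
  "overrides": {
    "@mlc-ai/web-llm": "0.2.62"
  }
}
```

This pins every resolved copy of @mlc-ai/web-llm in the tree to 0.2.62, regardless of what @langchain/community declares.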

System Info

"@langchain/community": "^0.2.31"
"@langchain/langgraph": "^0.1.9"
"@mlc-ai/web-llm": "^0.2.46"

platform: Mac
node version: 22.4.1


dosubot bot commented Sep 2, 2024

The issue with loading the model "gemma-2-2b-it-q4f16_1-MLC" is likely because this model ID is not listed in the available model records. You can check the list of available model records here: https://github.com/mlc-ai/web-llm/blob/main/src/config.ts [1].

Additionally, the current version of the @mlc-ai/web-llm package specified in the package.json file of the LangChain.js repository is 0.2.46 [2]. If you believe that updating to a newer version (e.g., 0.2.62) would resolve the issue, you might consider updating the dependency in your project. However, ensure that the model ID you are trying to use is supported in the newer version.
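As the comment above notes, the model ID must appear in the engine's model_list. One workaround while stuck on an older web-llm is to pass a custom appConfig to CreateMLCEngine. The sketch below builds such a config as a plain object; the record field names (model, model_id, model_lib) and the placeholder URLs are assumptions based on recent versions of web-llm's config.ts and should be checked against the config.ts of your installed version:

```typescript
// Sketch: a custom appConfig adding a model record that the bundled
// prebuilt list does not know about. Field names and URLs are
// assumptions -- verify them against your web-llm version's config.ts.
const appConfig = {
  model_list: [
    {
      // URL of the model weights repo (hypothetical placeholder)
      model: "https://huggingface.co/mlc-ai/gemma-2-2b-it-q4f16_1-MLC",
      // The ID you pass to CreateMLCEngine must match this exactly
      model_id: "gemma-2-2b-it-q4f16_1-MLC",
      // URL of the compiled WebGPU model library (hypothetical placeholder)
      model_lib: "https://example.com/gemma-2-2b-it-q4f16_1-webgpu.wasm",
    },
  ],
}

// Then pass it alongside the progress callback (browser-only, sketched here):
// const engine = await webllm.CreateMLCEngine("gemma-2-2b-it-q4f16_1-MLC", {
//   appConfig,
//   initProgressCallback,
// })
```

This only sidesteps the lookup error; the older runtime still has to be able to execute the model, so updating the dependency remains the proper fix.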


@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label Sep 2, 2024