Node Serverless deployment fails due to bundling issue #1110

Closed
SyedAli00896 opened this issue Aug 9, 2024 · 7 comments
Labels: bug (Something isn't working)

Comments

@SyedAli00896

Hi, I am using the llamaindex library in my AWS Node.js serverless project. It works fine in development mode, but when I try to deploy it, the code bundling step fails. Here are the errors:

```
✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/darwin/arm64/onnxruntime_binding.node

node_modules/onnxruntime-node/dist/binding.js:9:8:
  9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);
    ╵         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/linux/x64/onnxruntime_binding.node

node_modules/onnxruntime-node/dist/binding.js:9:8:
  9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);
    ╵         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/darwin/x64/onnxruntime_binding.node

node_modules/onnxruntime-node/dist/binding.js:9:8:
  9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);
    ╵         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/linux/arm64/onnxruntime_binding.node

node_modules/onnxruntime-node/dist/binding.js:9:8:
  9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);
    ╵         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/win32/arm64/onnxruntime_binding.node

node_modules/onnxruntime-node/dist/binding.js:9:8:
  9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);
    ╵         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

✘ [ERROR] No loader is configured for ".node" files: node_modules/onnxruntime-node/bin/napi-v3/win32/x64/onnxruntime_binding.node

node_modules/onnxruntime-node/dist/binding.js:9:8:
  9 │ require(`../bin/napi-v3/${process.platform}/${process.arch}/onnxruntime_binding.node`);

```

For reference, I'm using the serverless-esbuild plugin for code bundling; I've also tried serverless-webpack. Here are the relevant versions from my package.json:

"serverless": "^3.31.0",
"serverless-esbuild": "^1.48.5",
"llamaindex": "^0.5.12",

@himself65
Member

How about making this an external package?
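
For illustration, a minimal sketch of what externalizing the package could look like, expressed with esbuild's own API (serverless-esbuild forwards equivalent options, but exact config placement depends on the plugin version; the entry point and output paths below are assumptions based on the logs above):

```js
// Sketch only: keep llamaindex (and its native onnxruntime dependency) out of the
// bundle; the externalized packages must then be shipped in node_modules of the
// deployment package instead.
const esbuild = require('esbuild');

esbuild
  .build({
    entryPoints: ['src/service/ai/chat/message.js'], // assumed entry point
    bundle: true,
    platform: 'node',
    target: 'node18', // assumed Lambda runtime
    outdir: 'dist',   // assumed output directory
    external: ['llamaindex'], // do not try to bundle llamaindex or its .node binaries
  })
  .catch(() => process.exit(1));
```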

himself65 added the bug label on Aug 9, 2024
@SyedAli00896
Author

> How about making this an external package?

In that scenario, the Lambda package size limit gets exceeded. Also, the library doesn't seem to support tree shaking, unfortunately.

@himself65
Member

What's your use case in this scenario? Which modules are you using?

@himself65
Member

Add this to your esbuild config:

evanw/esbuild#1051 (comment)
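
For context, a minimal sketch of the kind of setting this points at, assuming the common `file`-loader workaround for native `.node` addons (not necessarily the exact snippet from the linked comment; option names are plain esbuild API):

```js
// Sketch only: register esbuild's built-in "file" loader for .node binaries so
// onnxruntime_binding.node is emitted as an asset next to the bundle instead of
// being parsed as JavaScript.
const esbuild = require('esbuild');

esbuild
  .build({
    entryPoints: ['src/service/ai/chat/message.js'], // assumed entry point
    bundle: true,
    platform: 'node',
    outdir: 'dist', // assumed output directory
    loader: { '.node': 'file' }, // copy .node addons alongside the bundled handler
  })
  .catch(() => process.exit(1));
```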

@SyedAli00896
Author

SyedAli00896 commented Aug 10, 2024

> Add this to your esbuild config:
>
> evanw/esbuild#1051 (comment)

Thanks. The solution you provided worked, and I successfully deployed the code to AWS. However, when I run the Lambda, it fails with a "502 Bad Gateway" status.
Checking the logs, it seems to be an issue with the tiktoken library. Here are the logs:

```
2024-08-10T20:26:29.549Z undefined ERROR Uncaught Exception
{
  "errorType": "Error",
  "errorMessage": "Missing tiktoken_bg.wasm",
  "stack": [
    "Error: Missing tiktoken_bg.wasm",
    " at node_modules/tiktoken/tiktoken.cjs (/var/task/src/service/ai/chat/message.js:11203:13)",
    " at __require2 (/var/task/src/service/ai/chat/message.js:18:53)",
    " at node_modules/@llamaindex/core/node_modules/@llamaindex/env/dist/tokenizers/node.js (/var/task/src/service/ai/chat/message.js:11218:31)",
    " at __init (/var/task/src/service/ai/chat/message.js:15:59)",
    " at node_modules/@llamaindex/core/node_modules/@llamaindex/env/dist/index.js (/var/task/src/service/ai/chat/message.js:11301:5)",
    " at __init (/var/task/src/service/ai/chat/message.js:15:59)",
    " at node_modules/@llamaindex/core/dist/global/index.js (/var/task/src/service/ai/chat/message.js:11344:5)",
    " at __init (/var/task/src/service/ai/chat/message.js:15:59)",
    " at Object.<anonymous> (/var/task/src/service/ai/chat/message.js:582537:1)",
    " at Module._compile (node:internal/modules/cjs/loader:1364:14)"
  ]
}
```

When I tried making tiktoken an external dependency instead, I got the following error in my Lambda:

```
2024-08-10T20:42:57.168Z undefined ERROR Uncaught Exception
{
  "errorType": "Runtime.ImportModuleError",
  "errorMessage": "Error: Cannot find module 'tiktoken'\nRequire stack:\n- /var/task/src/service/ai/chat/message.js\n- /var/runtime/index.mjs",
  "stack": [
    "Runtime.ImportModuleError: Error: Cannot find module 'tiktoken'",
    "Require stack:",
    "- /var/task/src/service/ai/chat/message.js",
    "- /var/runtime/index.mjs",
    " at _loadUserApp (file:///var/runtime/index.mjs:1087:17)",
    " at async UserFunction.js.module.exports.load (file:///var/runtime/index.mjs:1119:21)",
    " at async start (file:///var/runtime/index.mjs:1282:23)",
    " at async file:///var/runtime/index.mjs:1288:1"
  ]
}
```

@marcusschiesser
Collaborator

@SyedAli00896 we had a similar issue with a NextJS deployment on Vercel, see run-llama/create-llama#164 (it was fixed by copying the WASM file; see https://github.com/run-llama/create-llama/pull/201/files).
As NextJS/Vercel is based on AWS NodeJS serverless, to my knowledge, copying the WASM file should help here, too.
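
A rough sketch of what that copy step could look like in this serverless setup (the source path mirrors the create-llama fix; the destination directory is an assumption based on the logs above, not a verified recipe):

```js
// copy-tiktoken-wasm.js – hypothetical post-bundle step: copy tiktoken_bg.wasm from
// node_modules next to the bundled handler so tiktoken can find it at runtime.
// Adjust destDir to wherever your bundler writes the handler (the logs show
// /var/task/src/service/ai/chat/ at runtime).
const fs = require('fs');
const path = require('path');

const src = path.join(__dirname, 'node_modules', 'tiktoken', 'tiktoken_bg.wasm');
const destDir = path.join(__dirname, 'dist', 'src', 'service', 'ai', 'chat'); // assumed output dir

fs.mkdirSync(destDir, { recursive: true });
fs.copyFileSync(src, path.join(destDir, 'tiktoken_bg.wasm'));
console.log(`Copied ${src} -> ${destDir}`);
```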

@marcusschiesser
Collaborator

Fixed in the latest release, 0.5.20 (use withLlamaIndex).
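
For reference, a sketch of the withLlamaIndex usage, following the NextJS-style setup from the LlamaIndex.TS docs; the `llamaindex/next` subpath export is assumed to be available from 0.5.20 onwards:

```js
// next.config.mjs – sketch only; assumes the "llamaindex/next" export.
import withLlamaIndex from "llamaindex/next";

/** @type {import('next').NextConfig} */
const nextConfig = {
  // existing NextJS settings go here
};

// The wrapper adjusts the bundler config so assets like tiktoken_bg.wasm are included.
export default withLlamaIndex(nextConfig);
```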
