Example of setting up Python Lambdas with the Serverless Framework and serverless-python-requirements, sharing a common module across services without using Lambda Layers.
```
/
├─ services/ - services, each is a Serverless stack
│  ├─ first/ - stack with the first lambda
│  ├─ pycommon/ - scripts reused in the first and second modules
│  ├─ second/ - stack with the second lambda
│  ... more Serverless stacks
```
When working with a monorepo, we often share the same code, configuration, and libraries across different modules. Imports in Python can be tricky and are not as straightforward as in a similar repository layout using Node.js.
Most answers in related threads found online advise using Lambda Layers, but this project shows an example of how to do it without them.
It can be achieved using Poetry (or pip) together with the vendor library directory feature of serverless-python-requirements. The trick is to:
- create a common module, e.g. common
- for local development and testing, reference the common module as a path dependency in pyproject.toml
- to include its files in the package uploaded to AWS, add the common directory to the vendor option of serverless.yml (see the sketches after this list)
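For illustration, a minimal sketch of those two pieces, assuming the shared code lives in services/pycommon as a pycommon/ package and the first service is a Poetry project under services/first (the concrete names and paths in this repo may differ):

```toml
# services/first/pyproject.toml (sketch)
[tool.poetry]
name = "first"
version = "0.1.0"
description = "First lambda service"
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.9"
# Path dependency: lets local runs and tests import the shared code directly.
pycommon = { path = "../pycommon", develop = true }
```

```yaml
# services/first/serverless.yml (sketch)
custom:
  pythonRequirements:
    # Everything inside the vendor directory is copied into the deployment
    # package root, so `import pycommon` also works once deployed to AWS.
    vendor: ../pycommon
```

With both in place, the handler can use the same `import pycommon` line locally (via the path dependency) and on Lambda (via the vendored copy).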
Some libraries like numpy or pandas ship OS-specific binaries, so to work on AWS Lambda the project needs to be built in a Linux environment. So if, like me, you work on a Mac, you need to add the dockerizePip configuration to your serverless.yml to make it work.
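A sketch of that configuration, continuing the serverless.yml fragment above (dockerizePip set to non-linux only uses Docker when packaging from a non-Linux host):

```yaml
# services/first/serverless.yml (sketch, continued)
plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    # Build native dependencies (numpy, pandas, ...) inside a Lambda-like
    # Docker image when packaging from macOS or Windows.
    dockerizePip: non-linux
    vendor: ../pycommon
```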
On your local machine, install:
- Poetry
- Docker
- Python 3+
Install:
```
yarn install
```
Test:
```
yarn run test
```
Deploy:
```
yarn run deploy
```
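For reference, a hypothetical root package.json showing how such yarn scripts might fan out to each service with the Serverless CLI; the actual scripts and versions in this repo may differ:

```json
{
  "private": true,
  "scripts": {
    "test": "cd services/first && poetry run pytest && cd ../second && poetry run pytest",
    "deploy": "cd services/first && sls deploy && cd ../second && sls deploy"
  },
  "devDependencies": {
    "serverless": "^3.38.0",
    "serverless-python-requirements": "^6.1.0"
  }
}
```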