This is a sample project for Python development with CDK. The `cdk.json` file tells the CDK Toolkit how to execute your app.
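For reference, the `cdk.json` generated for a Python CDK app is a small JSON file that points the toolkit at the app's entry point. A minimal sketch is shown below; it assumes the default `app.py` entry point, which may be named differently in this project:

```json
{
  "app": "python3 app.py"
}
```

Context values such as the bucket and environment names used later in this README can also be stored under a `context` key in this file instead of being passed with `-c` on the command line.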
This project is set up like a standard Python project. The initialization process also creates a virtualenv within this project, stored under the `.venv` directory. To create the virtualenv it assumes that there is a `python3` (or `python` for Windows) executable in your path with access to the `venv` package. If for any reason the automatic creation of the virtualenv fails, you can create the virtualenv manually.
To manually create a virtualenv on MacOS and Linux:
$ python3 -m venv .venv
After the init process completes and the virtualenv is created, you can use the following step to activate your virtualenv.
$ source .venv/bin/activate
If you are on a Windows platform, you would activate the virtualenv like this:
% .venv\Scripts\activate.bat
Once the virtualenv is activated, you can install the required dependencies.
(.venv) $ pip install -r requirements.txt
Before you deploy this project, you should create an Amazon S3 bucket to store your Apache Airflow Directed Acyclic Graphs (DAGs), custom plugins in a `plugins.zip` file, and Python dependencies in a `requirements.txt` file. See Create an Amazon S3 bucket for Amazon MWAA for details.
$ aws s3 mb s3://your-s3-bucket-for-airflow-dag-code --region region-name
$ aws s3api put-public-access-block --bucket your-s3-bucket-for-airflow-dag-code --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
$ aws s3api put-bucket-versioning --bucket your-s3-bucket-for-airflow-dag-code --versioning-configuration Status=Enabled
$ aws s3api put-object --bucket your-s3-bucket-for-airflow-dag-code --key dags/
$ aws s3api put-object --bucket your-s3-bucket-for-airflow-dag-code --key requirements/requirements.txt
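The `dags/` prefix created above is where MWAA looks for DAG definitions. As a quick smoke test, you could upload a minimal DAG such as the sketch below (a hypothetical file, written against Apache Airflow 2.0.x, the version implied by the constraints file referenced at the end of this README):

```python
# dags/hello_mwaa.py - hypothetical example DAG for a quick smoke test.
# Upload with: aws s3 cp dags/hello_mwaa.py s3://your-s3-bucket-for-airflow-dag-code/dags/
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def say_hello():
    print("Hello from Amazon MWAA!")


with DAG(
    dag_id="hello_mwaa",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,  # run only when triggered manually from the Airflow UI
    catchup=False,
) as dag:
    PythonOperator(task_id="say_hello", python_callable=say_hello)
```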
At this point you can synthesize the CloudFormation template for this code.
(.venv) $ export CDK_DEFAULT_ACCOUNT=$(aws sts get-caller-identity --query Account --output text)
(.venv) $ export CDK_DEFAULT_REGION=$(aws configure get region)
(.venv) $ cdk -c s3_bucket_for_dag_code='your-s3-bucket-for-airflow-dag-code' \
              -c airflow_env_name='your-airflow-env-name' \
              synth --all
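The two `-c` flags supply CDK context values. How they are consumed depends on this project's `app.py`, but a typical pattern looks like the illustrative sketch below (it assumes the CDK v2 `aws_cdk` module and hypothetical stack wiring; a CDK v1 app would import `core` from `aws_cdk` instead):

```python
#!/usr/bin/env python3
# Illustrative sketch of reading the "-c" context values in app.py;
# the actual stack classes and wiring in this project may differ.
import os

import aws_cdk as cdk

app = cdk.App()

# Supplied on the command line as:
#   -c s3_bucket_for_dag_code=... -c airflow_env_name=...
dag_bucket_name = app.node.try_get_context("s3_bucket_for_dag_code")
airflow_env_name = app.node.try_get_context("airflow_env_name")

env = cdk.Environment(
    account=os.environ["CDK_DEFAULT_ACCOUNT"],
    region=os.environ["CDK_DEFAULT_REGION"],
)

# ... instantiate the VPC/MWAA stacks here, passing dag_bucket_name and airflow_env_name ...

app.synth()
```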
Use the `cdk deploy` command to create the stack shown above.
(.venv) $ cdk -c s3_bucket_for_dag_code='your-s3-bucket-for-airflow-dag-code' \
              -c airflow_env_name='your-airflow-env-name' \
              deploy --all
To add additional dependencies, for example other CDK libraries, just add them to your `setup.py` file and rerun the `pip install -r requirements.txt` command.
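For reference, dependencies usually live in the `install_requires` list of `setup.py`, which `requirements.txt` pulls in when it installs the package (commonly via an editable `-e .` entry). A minimal sketch, using a hypothetical package name and example version pins that may differ from this project's:

```python
# setup.py fragment (illustrative; actual package name and pins in this project may differ)
import setuptools

setuptools.setup(
    name="mwaa-cdk-sample",  # hypothetical package name
    version="0.0.1",
    packages=setuptools.find_packages(),
    install_requires=[
        # Add additional CDK libraries or other Python dependencies here,
        # then rerun "pip install -r requirements.txt".
        "aws-cdk-lib>=2.0.0",
        "constructs>=10.0.0,<11.0.0",
    ],
    python_requires=">=3.6",
)
```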
Delete the CloudFormation stacks by running the command below.
(.venv) $ cdk destroy --force --all
- `cdk ls` list all stacks in the app
- `cdk synth` emits the synthesized CloudFormation template
- `cdk deploy` deploy this stack to your default AWS account/region
- `cdk diff` compare deployed stack with current state
- `cdk docs` open CDK documentation
- Amazon MWAA - Networking
- Apache Airflow versions on Amazon Managed Workflows for Apache Airflow
- Amazon MWAA frequently asked questions
- Troubleshooting Amazon Managed Workflows for Apache Airflow
- Tutorials for Amazon Managed Workflows for Apache Airflow
- Best practices for Amazon Managed Workflows for Apache Airflow
- Code examples for Amazon Managed Workflows for Apache Airflow
- Orchestrate AWS Glue DataBrew jobs using Amazon Managed Workflows for Apache Airflow
- Code Repository: aws-mwaa-glue-databrew-nytaxi
- Amazon MWAA for Analytics Workshop
- Get started with Amazon Managed Workflows for Apache Airflow (MWAA)
- To update `requirements.txt`, run commands like this:

  $ obj_version=$(aws s3api list-object-versions --bucket your-s3-bucket-for-airflow-requirements --prefix 'requirements/requirements.txt' | jq '.Versions[0].VersionId' | sed -e "s/\"//g")
  $ echo ${obj_version}
  $ aws mwaa update-environment \
      --region region-name \
      --name your-airflow-environment \
      --requirements-s3-object-version ${obj_version}
- Sample `requirements.txt`:
  - ℹ️ You MUST check the Python package versions against the Airflow constraints file here: https://raw.githubusercontent.com/apache/airflow/constraints-2.0.2/constraints-3.7.txt

  apache-airflow-providers-elasticsearch==1.0.3
  apache-airflow-providers-redis==1.0.1
  apache-airflow-providers-google==2.2.0
  apache-airflow-providers-mysql==1.1.0
Enjoy!