intermediate checkin
Ilyin committed Jun 13, 2021
1 parent 8b659f2 commit 7f5fa31
Showing 14 changed files with 254 additions and 122 deletions.
40 changes: 36 additions & 4 deletions README.md
@@ -915,7 +915,8 @@ aws cloudformation create-stack \
ParameterKey=EnvName,ParameterValue=$ENV_NAME \
ParameterKey=EnvType,ParameterValue=dev \
ParameterKey=AvailabilityZones,ParameterValue=${AWS_DEFAULT_REGION}a\\,${AWS_DEFAULT_REGION}b \
ParameterKey=NumberOfAZs,ParameterValue=2
ParameterKey=NumberOfAZs,ParameterValue=2 \
ParameterKey=SeedCodeS3BucketName,ParameterValue=$S3_BUCKET_NAME
```
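The doubled backslash in the `AvailabilityZones` value is easy to get wrong: the shell consumes one backslash, leaving `\,` for the CLI, which then treats the comma as part of a single `ParameterValue` instead of a list separator. A minimal sketch of the expansion (region value assumed for illustration):

```shell
# Demonstrate what the CLI actually receives after shell expansion:
# "\\," in double quotes becomes "\," — an escaped (non-splitting) comma.
AWS_DEFAULT_REGION=us-east-1   # assumed region, for illustration only
AZ_PARAM="ParameterValue=${AWS_DEFAULT_REGION}a\\,${AWS_DEFAULT_REGION}b"
echo "$AZ_PARAM"
```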

If you would like to use **multi-account model deployment**, you must provide valid values for the OU IDs and the name for the `SetupStackSetExecutionRole`:
@@ -938,11 +939,14 @@ aws cloudformation create-stack \
ParameterKey=AvailabilityZones,ParameterValue=${AWS_DEFAULT_REGION}a\\,${AWS_DEFAULT_REGION}b \
ParameterKey=NumberOfAZs,ParameterValue=2 \
ParameterKey=StartKernelGatewayApps,ParameterValue=YES \
ParameterKey=SeedCodeS3BucketName,ParameterValue=$S3_BUCKET_NAME \
ParameterKey=OrganizationalUnitStagingId,ParameterValue=$STAGING_OU_ID \
ParameterKey=OrganizationalUnitProdId,ParameterValue=$PROD_OU_ID \
ParameterKey=SetupStackSetExecutionRoleName,ParameterValue=$SETUP_STACKSET_ROLE_NAME
```

If you do not have an AWS Organizations setup, you can omit the `OrganizationalUnitStagingId` and `OrganizationalUnitProdId` parameters from the previous call.
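The single-account and multi-account calls can be unified by building the optional OU parameters conditionally. A hedged sketch (variable names assumed; both IDs left empty here to model the no-Organizations case):

```shell
# Include the OU parameters only when both IDs are set, so the same
# create-stack call works with or without an AWS Organizations setup.
STAGING_OU_ID=""   # leave empty when there is no AWS Organizations setup
PROD_OU_ID=""
EXTRA_PARAMS=""
if [ -n "$STAGING_OU_ID" ] && [ -n "$PROD_OU_ID" ]; then
  EXTRA_PARAMS="ParameterKey=OrganizationalUnitStagingId,ParameterValue=$STAGING_OU_ID ParameterKey=OrganizationalUnitProdId,ParameterValue=$PROD_OU_ID"
fi
echo "OU parameters: [$EXTRA_PARAMS]"
```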

### Cleanup
First, delete the two root stacks from AWS CloudFormation console or command line:
```bash
@@ -974,6 +978,7 @@ aws cloudformation describe-stacks \

Copy and paste the `AssumeDSAdministratorRole` link to a web browser and switch role to DS Administrator.
Go to AWS Service Catalog in the AWS console and select **Products** on the left pane:

![service-catalog-end-user-products](img/service-catalog-end-user-products.png)

You will see the list of available products for your user role:
@@ -984,7 +989,7 @@ Click on the product name and then on the **Launch product** on the product

![service-catalog-launch-product](img/service-catalog-launch-product.png)

Fill the product parameters with values specific for your environment. Provide the valid values for OU ids and the name for the `SetupStackSetExecutionRole` if you would like to enable multi-account model deployment.
Fill the product parameters with values specific for your environment. Provide the valid values for OU ids and the name for the `SetupStackSetExecutionRole` if you would like to enable multi-account model deployment, otherwise keep these parameters empty.

Wait until AWS Service Catalog finishes the provisioning of the Data Science environment stack and the product status becomes **Available**. The data science environment provisioning takes about 20 minutes to complete.
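Instead of watching the console, the wait can be scripted as a polling loop. A sketch under stated assumptions: `get_status` is a hypothetical stub standing in for the real query (typically `aws servicecatalog describe-provisioned-product` with a `--query` on the product status), so the loop itself is what is being illustrated:

```shell
# Poll until the provisioned product reports AVAILABLE.
# get_status is a stub for illustration; replace it with the real
# AWS CLI status query in an actual environment.
get_status() { echo "AVAILABLE"; }
STATUS=""
while [ "$STATUS" != "AVAILABLE" ]; do
  STATUS=$(get_status)
  [ "$STATUS" = "AVAILABLE" ] || sleep 30   # back off between polls
done
echo "Provisioning finished: $STATUS"
```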

@@ -1145,6 +1150,26 @@ aws cloudformation deploy \
--capabilities CAPABILITY_NAMED_IAM
```

Deploy the setup stack set execution role in each of the target (staging and production) accounts. This step is needed only if:
1. You are going to use the multi-account model deployment option
2. You want the data science environment deployment to provision the network infrastructure and IAM roles in the target accounts

```sh
ENV_NAME=ds-team
ENV_TYPE=dev
ADMIN_ACCOUNT_ID=<data science account id>  # the account that hosts SageMaker Studio
SETUP_STACKSET_ROLE_NAME=$ENV_NAME-setup-stackset-role

aws cloudformation deploy \
--template-file build/$AWS_DEFAULT_REGION/env-iam-setup-stackset-role.yaml \
--stack-name $ENV_NAME-setup-stackset-execution-role \
--capabilities CAPABILITY_NAMED_IAM \
--parameter-overrides \
EnvName=$ENV_NAME \
EnvType=$ENV_TYPE \
StackSetExecutionRoleName=$SETUP_STACKSET_ROLE_NAME \
AdministratorAccountId=$ADMIN_ACCOUNT_ID
```

Deploy IAM shared roles:
```bash
STACK_SET_NAME=ds-team
@@ -1172,6 +1197,8 @@ aws cloudformation deploy \
EnvType=dev
```

If you want to provision the target account infrastructure during the data science environment deployment, you must provide the value for the `SetupStackSetExecutionRoleName` parameter.
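This optional parameter can also be appended programmatically. A minimal sketch, assuming the variable names used earlier in this README and a hypothetical `PROVISION_TARGET_ACCOUNTS` flag:

```shell
# Append SetupStackSetExecutionRoleName to the --parameter-overrides list
# only when target-account provisioning is wanted during deployment.
ENV_NAME=ds-team
SETUP_STACKSET_ROLE_NAME=$ENV_NAME-setup-stackset-role
PROVISION_TARGET_ACCOUNTS=YES   # hypothetical flag for this sketch
OVERRIDES="EnvName=$ENV_NAME EnvType=dev"
if [ "$PROVISION_TARGET_ACCOUNTS" = "YES" ]; then
  OVERRIDES="$OVERRIDES SetupStackSetExecutionRoleName=$SETUP_STACKSET_ROLE_NAME"
fi
echo "$OVERRIDES"
```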

Deploy SageMaker model deployment roles in development, staging, and production AWS accounts:
```bash
aws cloudformation deploy \
@@ -1181,11 +1208,15 @@ aws cloudformation deploy \
--parameter-overrides \
EnvName=$ENV_NAME \
EnvType=dev \
PipelineExecutionRoleArn=arn:aws:iam::ACCOUNT_ID:role/service-role/AmazonSageMakerServiceCatalogProductsUseRole \
PipelineExecutionRoleArn=arn:aws:iam::ACCOUNT_ID:role/service-role/AmazonSageMakerServiceCatalogProductsUseRole \
AdministratorAccountId=<DATA SCIENCE ACCOUNT ID> \
ModelS3KMSKeyArn=<AWS KMS Key for S3 model bucket> \
ModelBucketName=<S3 Model bucket name>
```

If you do not use multi-account deployment, you do not need to deploy these roles into the staging and production accounts. Deploy to the development account only.
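For the multi-account case, the same deploy command is simply repeated once per account. One common pattern is iterating over named AWS CLI profiles; the profile names below are assumptions for illustration, and the real command is left as a comment since it requires live credentials:

```shell
# Repeat the role deployment for each account by switching CLI profiles
# (profile names "dev", "staging", "prod" are assumed, not prescribed).
for PROFILE in dev staging prod; do
  echo "deploying model deployment roles via profile: $PROFILE"
  # aws cloudformation deploy --profile "$PROFILE" ... (same arguments as above)
done
```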

Show IAM role ARNs:
```bash
aws cloudformation describe-stacks \
@@ -1271,7 +1302,8 @@ aws cloudformation create-stack \
ParameterKey=CreateVPCFlowLogsToCloudWatch,ParameterValue=NO \
ParameterKey=CreateVPCFlowLogsRole,ParameterValue=NO \
ParameterKey=AvailabilityZones,ParameterValue=${AWS_DEFAULT_REGION}a\\,${AWS_DEFAULT_REGION}b\\,${AWS_DEFAULT_REGION}c \
ParameterKey=NumberOfAZs,ParameterValue=3
ParameterKey=NumberOfAZs,ParameterValue=3 \
ParameterKey=SeedCodeS3BucketName,ParameterValue=$S3_BUCKET_NAME
```

## Clean-up
12 changes: 6 additions & 6 deletions cfn_templates/core-iam-sc-sm-projects-roles.yaml
@@ -382,12 +382,12 @@ Resources:
- cloudformation:TagResource
Resource: "*"
Effect: Allow
-
Action:
- organizations:DescribeOrganizationalUnit
- organizations:ListAccountsForParent
Resource: "arn:aws:organizations::*:ou/o-*/ou-*"
Effect: Allow
#-
# Action:
# - organizations:DescribeOrganizationalUnit
# - organizations:ListAccountsForParent
# Effect: Allow
# Resource: "arn:aws:organizations::*:ou/o-*/ou-*"
-
Action:
- cloudwatch:PutMetricData
11 changes: 11 additions & 0 deletions cfn_templates/data-science-environment-quickstart.yaml
@@ -29,6 +29,10 @@ Metadata:
default: Deployment Options
Parameters:
- CreateSharedServices
- Label:
default: S3 Bucket Name with MLOps Seed Code
Parameters:
- SeedCodeS3BucketName
- Label:
default: Network Configuration
Parameters:
@@ -47,6 +51,8 @@ Metadata:
default: Environment type
CreateSharedServices:
default: Create Shared Services (PyPI mirror)
SeedCodeS3BucketName:
default: Existing S3 bucket name where MLOps seed code will be stored
VPCCIDR:
default: VPC CIDR block
PrivateSubnet1ACIDR:
@@ -94,6 +100,10 @@ Parameters:
- 'NO'
Description: Set to YES if you want to provision the shared services VPC network and PyPI mirror repository

SeedCodeS3BucketName:
Description: S3 bucket name to store MLOps seed code (the S3 bucket must exist)
Type: String

VPCCIDR:
AllowedPattern: ^(([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])(\/(1[6-9]|2[0-8]))$
ConstraintDescription: CIDR block parameter must be in the form x.x.x.x/16-28
@@ -194,6 +204,7 @@ Resources:
PrivateSubnet2ACIDR: !Ref PrivateSubnet2ACIDR
PublicSubnet1CIDR: !Ref PublicSubnet1CIDR
PublicSubnet2CIDR: !Ref PublicSubnet2CIDR
SeedCodeS3BucketName: !Ref SeedCodeS3BucketName
TemplateURL: env-main.yaml
Tags:
- Key: EnvironmentName
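The `AllowedPattern` for `VPCCIDR` in the quickstart template can be exercised locally before a deploy attempt. A sketch using `grep -E` with the same regex (only the `\/` escape is dropped, since ERE does not require it):

```shell
# Validate candidate VPC CIDRs against the template's AllowedPattern:
# four octets 0-255 plus a prefix length restricted to /16 through /28.
PATTERN='^(([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])\.){3}([0-9]|[1-9][0-9]|1[0-9]{2}|2[0-4][0-9]|25[0-5])(/(1[6-9]|2[0-8]))$'
check() { echo "$1" | grep -Eq "$PATTERN" && echo valid || echo invalid; }
check 10.0.0.0/16   # prefix length within the allowed range
check 10.0.0.0/8    # prefix length outside /16-/28
```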
