fix: unused imports, unused context, typos (#4)
Various tidy up, unused imports, unused context, typos
quixoticmonk authored Sep 5, 2024
1 parent acbbea3 commit 8eeb9df
Showing 12 changed files with 84 additions and 82 deletions.
2 changes: 2 additions & 0 deletions .gitignore
@@ -44,3 +44,5 @@ terraform.rc
settings.json
TODO.md
.DS_Store
.idea
.venv
8 changes: 4 additions & 4 deletions CONTRIBUTING.md
@@ -8,7 +8,7 @@ For best practices and information on developing with Terraform, see the [I&A Mo

## Contributing Code

In order to contibute code to this repository, you must submit a *[Pull Request](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request)*. To do so, you must *[fork](https://docs.github.com/en/get-started/quickstart/fork-a-repo)* this repostiory, make your changes in your forked version and submit a *Pull Request*.
In order to contribute code to this repository, you must submit a *[Pull Request](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request)*. To do so, you must *[fork](https://docs.github.com/en/get-started/quickstart/fork-a-repo)* this repository, make your changes in your forked version and submit a *Pull Request*.

## Writing Documentation

@@ -20,7 +20,7 @@ README.md is automatically generated by pulling in content from other files. For

Pull Requests (PRs) submitted against this repository undergo a series of static and functional checks.

> :exclamation: Note: Failures during funtional or static checks will prevent a pull request from being accepted.
> :exclamation: Note: Failures during functional or static checks will prevent a pull request from being accepted.
It is a best practice to perform these checks locally prior to submitting a pull request.

@@ -37,15 +37,15 @@ TIPS: **do not** modify the `./project_automation/{test-name}/entrypoint.sh`, in
- Checkov
- Terratest

> :bangbang: The readme.md file will be created after all checks have completed successfuly, it is recommended that you install terraform-docs locally in order to preview your readme.md file prior to publication.
> :bangbang: The readme.md file will be created after all checks have completed successfully, it is recommended that you install terraform-docs locally to preview your README.md file prior to publication.
## Install the required tools

Prerequisites:

- [Python](https://docs.python.org/3/using/index.html)
- [Pip](https://pip.pypa.io/en/stable/installation/)
- [golang](https://go.dev/doc/install) (for macos you can use `brew`)
- [golang](https://go.dev/doc/install) (for macOS you can use `brew`)
- [tflint](https://github.com/terraform-linters/tflint)
- [tfsec](https://aquasecurity.github.io/tfsec/v1.0.11/)
- [Markdown Lint](https://github.com/markdownlint/markdownlint)
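The checks this CONTRIBUTING.md section describes can also be run locally before opening a pull request. A minimal sketch, assuming the prerequisite tools above are installed and on your PATH; exact flags and working directory may vary with the repository layout and tool versions:

```shell
# Run the same static checks the PR pipeline performs, from the module root.
tflint --init && tflint               # Terraform linting
tfsec .                               # static security analysis
checkov -d .                          # policy-as-code scanning
mdl .                                 # Markdown lint (markdownlint gem)
terraform-docs markdown . > README.md # preview the generated README.md
```

Terratest checks run as Go tests (typically `go test` under the repository's test directory), so they need the golang prerequisite as well.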
2 changes: 1 addition & 1 deletion README.md
@@ -190,7 +190,7 @@ To use this module you need have the following:
| <a name="input_lambda_python_runtime"></a> [lambda\_python\_runtime](#input\_lambda\_python\_runtime) | Lambda Python runtime | `string` | `"python3.11"` | no |
| <a name="input_lambda_reserved_concurrency"></a> [lambda\_reserved\_concurrency](#input\_lambda\_reserved\_concurrency) | Maximum Lambda reserved concurrency, make sure your AWS quota is sufficient | `number` | `100` | no |
| <a name="input_name_prefix"></a> [name\_prefix](#input\_name\_prefix) | Name to be used on all the resources as identifier. | `string` | `"runtask-tf-plan-analyzer"` | no |
| <a name="input_recovery_window"></a> [recovery\_window](#input\_recovery\_window) | Numbers of day Number of days that AWS Secrets Manager waits before it can delete the secret | `number` | `0` | no |
| <a name="input_recovery_window"></a> [recovery\_window](#input\_recovery\_window) | Number of days that AWS Secrets Manager waits before it can delete the secret | `number` | `0` | no |
| <a name="input_run_task_iam_roles"></a> [run\_task\_iam\_roles](#input\_run\_task\_iam\_roles) | List of IAM roles to be attached to the Lambda function | `list(string)` | `null` | no |
| <a name="input_runtask_stages"></a> [runtask\_stages](#input\_runtask\_stages) | List of all supported run task stages | `list(string)` | <pre>[<br> "pre_plan",<br> "post_plan",<br> "pre_apply"<br>]</pre> | no |
| <a name="input_tags"></a> [tags](#input\_tags) | Map of tags to apply to resources deployed by this solution. | `map(any)` | `null` | no |
3 changes: 1 addition & 2 deletions lambda/runtask_callback/handler.py
@@ -21,7 +21,6 @@
import re
from urllib.request import urlopen, Request
from urllib.error import HTTPError, URLError
from urllib.parse import urlencode

HCP_TF_HOST_NAME = os.environ.get("HCP_TF_HOST_NAME", "app.terraform.io")

@@ -32,7 +31,7 @@
logger.info("Log level set to %s" % logger.getEffectiveLevel())


def lambda_handler(event, context):
def lambda_handler(event, _):
logger.debug(json.dumps(event))
try:
# trim empty url from the payload
10 changes: 6 additions & 4 deletions lambda/runtask_edge/handler.py
@@ -1,17 +1,17 @@
import hashlib
import base64
import os
import hashlib
import json
from urllib.parse import parse_qs, urlencode
import logging
import os

logger = logging.getLogger()
log_level = os.environ.get("log_level", logging.INFO)

logger.setLevel(log_level)
logger.info("Log level set to %s" % logger.getEffectiveLevel())

def lambda_handler(event, context):

def lambda_handler(event, _):
logger.info("Incoming event : {}".format(json.dumps(event)))
request = event['Records'][0]['cf']['request']
headers = request["headers"]
@@ -31,9 +31,11 @@ def lambda_handler(event, context):
logger.info("Returning request: %s" % json.dumps(request))
return request


def decode_body(encoded_body):
return base64.b64decode(encoded_body).decode('utf-8')


def calculate_payload_hash(payload):
## generate sha256 from payload
return hashlib.sha256(payload.encode('utf-8')).hexdigest()
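The edge handler's two helpers shown above decode a base64-encoded CloudFront request body and hash it for signature checking. A self-contained sketch of the same pattern, using only the standard library and a hypothetical sample payload:

```python
import base64
import hashlib


def decode_body(encoded_body):
    # Mirror of the helper in runtask_edge/handler.py: CloudFront
    # delivers the request body base64-encoded.
    return base64.b64decode(encoded_body).decode("utf-8")


def calculate_payload_hash(payload):
    # Generate a sha256 hex digest from the decoded payload string.
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()


# Hypothetical payload, for illustration only.
encoded = base64.b64encode(b'{"detail": "example"}').decode("utf-8")
body = decode_body(encoded)
print(calculate_payload_hash(body))  # 64-character hex digest
```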
7 changes: 3 additions & 4 deletions lambda/runtask_eventbridge/handler.py
@@ -25,7 +25,6 @@
import logging
import urllib.parse
import boto3
import botocore
import botocore.session

from cgi import parse_header
@@ -63,7 +62,7 @@ class PutEventError(Exception):
"""Raised when Put Events Failed"""
pass

def lambda_handler(event, _context):
def lambda_handler(event, _):
"""Terraform run task function"""
logger.debug(json.dumps(event))

@@ -103,7 +102,7 @@ def lambda_handler(event, _context):
)
return {
"statusCode": 500,
"body": "FailedEntry Error - The entry could not be succesfully forwarded to Amazon EventBridge",
"body": "FailedEntry Error - The entry could not be successfully forwarded to Amazon EventBridge",
}

return {"statusCode": 200, "body": "Message forwarded to Amazon EventBridge"}
@@ -149,7 +148,7 @@ def contains_valid_cloudfront_signature(

def contains_valid_signature(event):
"""Check for the payload signature
HashiCorp Terraform run task documention: https://developer.hashicorp.com/terraform/cloud-docs/integrations/run-tasks#securing-your-run-task
HashiCorp Terraform run task documentation: https://developer.hashicorp.com/terraform/cloud-docs/integrations/run-tasks#securing-your-run-task
"""
secret = cache.get_secret_string(hcp_tf_hmac_secret_arn)
payload_bytes = get_payload_bytes(
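The `contains_valid_signature` function referenced in this hunk verifies the HMAC that HCP Terraform attaches to run task payloads, per the HashiCorp documentation linked in the docstring. A standalone sketch of that check, assuming HMAC-SHA512 (the algorithm the run task docs describe) and a hypothetical secret in place of the Secrets Manager lookup the real handler performs:

```python
import hashlib
import hmac


def contains_valid_signature(payload: bytes, signature: str, secret: str) -> bool:
    # Compute HMAC-SHA512 over the raw payload and compare it, in
    # constant time, against the signature header sent with the request.
    expected = hmac.new(secret.encode("utf-8"), payload, hashlib.sha512).hexdigest()
    return hmac.compare_digest(expected, signature)


payload = b'{"stage": "post_plan"}'
secret = "example-hmac-key"  # hypothetical; the module stores this in Secrets Manager
sig = hmac.new(secret.encode("utf-8"), payload, hashlib.sha512).hexdigest()
print(contains_valid_signature(payload, sig, secret))  # True
```

`hmac.compare_digest` rather than `==` avoids leaking the digest through timing differences.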
11 changes: 4 additions & 7 deletions lambda/runtask_fulfillment/ai.py
@@ -1,17 +1,15 @@
import json
import os

import boto3
import botocore
import logging
import subprocess
import os

from utils import logger, stream_messages, tool_config
from runtask_utils import generate_runtask_result
from tools.get_ami_releases import GetECSAmisReleases
from utils import logger, stream_messages, tool_config

# Initialize model_id and region
default_model_id = "anthropic.claude-3-sonnet-20240229-v1:0"
model_id = os.environ.get("BEDROCK_LLM_MODEL", default_model_id)
model_id = os.environ.get("BEDROCK_LLM_MODEL")
guardrail_id = os.environ.get("BEDROCK_GUARDRAIL_ID", None)
guardrail_version = os.environ.get("BEDROCK_GUARDRAIL_VERSION", None)

@@ -148,7 +146,6 @@ def eval(tf_plan_json):
tool = content["toolUse"]

if tool["name"] == "GetECSAmisReleases":
tool_result = {}

release_details = GetECSAmisReleases().execute(
tool["input"]["image_ids"]
14 changes: 6 additions & 8 deletions lambda/runtask_fulfillment/handler.py
@@ -1,9 +1,7 @@
import os
import sys
import json
import time
import logging
import requests
import os

import boto3

import ai
@@ -20,7 +18,7 @@
cwl_client = session.client('logs')

# THIS IS THE MAIN FUNCTION TO IMPLEMENT BUSINESS LOGIC
# TO PROCESS THE TERRFORM PLAN FILE or TERRAFORM CONFIG (.tar.gz)
# TO PROCESS THE TERRAFORM PLAN FILE or TERRAFORM CONFIG (.tar.gz)
# SCHEMA - https://developer.hashicorp.com/terraform/cloud-docs/api-docs/run-tasks/run-tasks-integration#severity-and-status-tags
def process_run_task(type: str, data: str, run_id: str):
url = None
@@ -63,7 +61,7 @@ def write_run_task_log(run_id: str, results: list, cw_log_group_dest: str):
)

# Main handler for the Lambda function
def lambda_handler(event, context):
def lambda_handler(event, _):

logger.debug(json.dumps(event, indent=4))

@@ -103,12 +101,12 @@ def lambda_handler(event, context):
configuration_version_download_url, access_token
)
logger.debug(
f"Config downloaded for Workspace: {organization_name}/{workspace_name}, Run: {run_id}\n downloaded at {os.getcwd()}/config"
f"Config downloaded for Workspace: {organization_name}/{workspace_id}, Run: {run_id}\n downloaded at {os.getcwd()}/config"
)

# Run the implemented business logic here
url, status, message, results = process_run_task(
type="pre_plan", path=config_file, run_id=run_id
type="pre_plan", data=config_file, run_id=run_id
)

elif event["payload"]["detail"]["stage"] == "post_plan":
59 changes: 31 additions & 28 deletions lambda/runtask_fulfillment/runtask_utils.py
@@ -1,14 +1,12 @@
import os
import re
import json
import tarfile
import hashlib
import logging
import requests
import os
import re
import time

from urllib.request import urlopen, Request
from urllib.error import HTTPError, URLError
from urllib.request import urlopen, Request

import requests

logging.basicConfig(format="%(levelname)s: %(message)s")
logger = logging.getLogger()
@@ -31,7 +29,7 @@ def download_config(configuration_version_download_url, access_token):
return config_file


def get_plan(url, access_token) -> str:
def get_plan(url, access_token) -> (str, str):
headers = {
"Authorization": f"Bearer {access_token}",
"Content-type": "application/vnd.api+json",
@@ -59,7 +57,7 @@ def get_plan(url, access_token) -> str:
except URLError as error:
logger.error(str(f"URL error: {error.reason}"))
return None, f"URL Error: {str(error)}"
except TimeoutError:
except TimeoutError as error:
logger.error(f"Timeout error: {str(error)}")
return None, f"Timeout Error: {str(error)}"
except Exception as error:
@@ -73,6 +71,7 @@ def validate_endpoint(endpoint):
result = re.match(pattern, endpoint)
return result


def generate_runtask_result(outcome_id, description, result):
result_json = json.dumps(
{
@@ -105,33 +104,37 @@ def convert_to_markdown(result):
return result


def log_helper(cwl_client, log_group_name, log_stream_name, log_message): # helper function to write RunTask results to dedicated cloudwatch log group
if log_group_name: # true if CW log group name is specified
def log_helper(cwl_client, log_group_name, log_stream_name,
log_message): # helper function to write RunTask results to dedicated cloudwatch log group
if log_group_name: # true if CW log group name is specified
global SEQUENCE_TOKEN
try:
SEQUENCE_TOKEN = log_writer(cwl_client, log_group_name, log_stream_name, log_message, SEQUENCE_TOKEN)["nextSequenceToken"]
SEQUENCE_TOKEN = log_writer(cwl_client, log_group_name, log_stream_name, log_message, SEQUENCE_TOKEN)[
"nextSequenceToken"]
except:
cwl_client.create_log_stream(logGroupName = log_group_name,logStreamName = log_stream_name)
cwl_client.create_log_stream(logGroupName=log_group_name, logStreamName=log_stream_name)
SEQUENCE_TOKEN = log_writer(cwl_client, log_group_name, log_stream_name, log_message)["nextSequenceToken"]

def log_writer(cwl_client, log_group_name, log_stream_name, log_message, sequence_token = False): # writer to CloudWatch log stream based on sequence token
if sequence_token: # if token exist, append to the previous token stream

def log_writer(cwl_client, log_group_name, log_stream_name, log_message,
sequence_token=False): # writer to CloudWatch log stream based on sequence token
if sequence_token: # if token exists, append to the previous token stream
response = cwl_client.put_log_events(
logGroupName = log_group_name,
logStreamName = log_stream_name,
logEvents = [{
'timestamp' : int(round(time.time() * 1000)),
'message' : time.strftime('%Y-%m-%d %H:%M:%S') + ": " + log_message
logGroupName=log_group_name,
logStreamName=log_stream_name,
logEvents=[{
'timestamp': int(round(time.time() * 1000)),
'message': time.strftime('%Y-%m-%d %H:%M:%S') + ": " + log_message
}],
sequenceToken = sequence_token
sequenceToken=sequence_token
)
else: # new log stream, no token exist
else: # new log stream, no token exist
response = cwl_client.put_log_events(
logGroupName = log_group_name,
logStreamName = log_stream_name,
logEvents = [{
'timestamp' : int(round(time.time() * 1000)),
'message' : time.strftime('%Y-%m-%d %H:%M:%S') + ": " + log_message
logGroupName=log_group_name,
logStreamName=log_stream_name,
logEvents=[{
'timestamp': int(round(time.time() * 1000)),
'message': time.strftime('%Y-%m-%d %H:%M:%S') + ": " + log_message
}]
)
return response
return response