feat: allow config var for slug override #55

Merged · 5 commits · Nov 14, 2022
26 changes: 26 additions & 0 deletions README.md
@@ -34,6 +34,7 @@ Plus, in addition to tarball & rsync, we also do not re-create another tarball f
- [Adjust compression level](#adjust-compression-level)
- [Continue to cache on failed builds](#continue-to-cache-on-failed-builds)
- [Multi-threaded compression](#multi-threaded-compression)
- [Sharing caches between pipelines](#sharing-caches-between-pipelines)
- [Auto deletion old caches](#auto-deletion-old-caches)
- [Globs on paths](#globs-on-paths)
- [Roadmap](#roadmap)
@@ -466,6 +467,31 @@ steps:
compress-program: pigz # tar will use `pigz` to compress and benefit from multithreading...
```

## Sharing caches between pipelines

If you have multiple pipelines that can benefit from referencing the same cache, you can use the `pipeline-slug-override` option:

```yaml
steps:
- name: ':jest: Run tests'
key: jest
command: yarn test --runInBand
plugins:
- gencer/cache#v2.4.11:
id: ruby # or ruby-3.0
backend: s3
key: "v1-cache-{{ id }}-{{ runner.os }}-{{ checksum 'Gemfile.lock' }}"
restore-keys:
- 'v1-cache-{{ id }}-{{ runner.os }}-'
- 'v1-cache-{{ id }}-'
compress: 2 # fast compression.
s3:
bucket: s3-bucket
paths:
- bundle/vendor
pipeline-slug-override: "other-pipeline" # other-pipeline references the same Gemfile.lock
```
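With the override in place, both pipelines resolve to the same cache prefix on the backend. A minimal sketch of the resulting S3 key layout, using illustrative org and bucket names (the real values come from Buildkite's environment):

```shell
# Illustrative values only; at runtime these come from BUILDKITE_ORGANIZATION_SLUG
# and the plugin's pipeline-slug-override option.
ORG="my-org"                       # stands in for BUILDKITE_ORGANIZATION_SLUG
SLUG="other-pipeline"              # the pipeline-slug-override value
PREFIX="s3-bucket/${ORG}/${SLUG}"  # prefix both pipelines now share

echo "${PREFIX}"                   # -> s3-bucket/my-org/other-pipeline
```

Without the override, each pipeline would cache under its own slug and never see the other's tarballs.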

## Auto deletion old caches

For tarballs, to keep caches and delete them after, _for example_, 7 days, use `max: 7`.
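Conceptually, this is age-based cleanup: tarballs older than `max` days are removed on the next run. A hedged sketch of that behavior (the plugin's real implementation may differ; paths and filenames here are illustrative):

```shell
# Sketch of age-based cache cleanup: delete cached tarballs whose
# modification time is more than MAX days old, keep newer ones.
CACHE_PREFIX="$(mktemp -d)"   # stand-in for the plugin's cache directory
MAX=7

touch -d "10 days ago" "${CACHE_PREFIX}/old.tgz"   # older than MAX -> deleted
touch "${CACHE_PREFIX}/fresh.tgz"                  # recent -> kept

# -mtime "+${MAX}" matches files strictly older than MAX days
find "${CACHE_PREFIX}" -name '*.tgz' -mtime "+${MAX}" -delete
```

`touch -d` is a GNU coreutils extension, which matches the Linux agents the plugin typically runs on.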
4 changes: 2 additions & 2 deletions lib/backends/rsync.bash
@@ -14,14 +14,14 @@ if [ -n "${BUILDKITE_PLUGIN_CACHE_RSYNC_PATH:-}" ]; then
fi

function restore() {
CACHE_PREFIX="${BK_BASE_DIR}/${BUILDKITE_ORGANIZATION_SLUG}/${BUILDKITE_PIPELINE_SLUG}"
CACHE_PREFIX="${BK_BASE_DIR}/${BUILDKITE_ORGANIZATION_SLUG}/$(pipeline_slug)"

mkdir -p "${CACHE_PREFIX}/${CACHE_KEY}"
rsync -a "$RSYNC_ARGS" "${CACHE_PREFIX}/${CACHE_KEY}/" .
}

function cache() {
CACHE_PREFIX="${BK_BASE_DIR}/${BUILDKITE_ORGANIZATION_SLUG}/${BUILDKITE_PIPELINE_SLUG}"
CACHE_PREFIX="${BK_BASE_DIR}/${BUILDKITE_ORGANIZATION_SLUG}/$(pipeline_slug)"
mkdir -p "${CACHE_PREFIX}/${CACHE_KEY}/"

if [ "${#paths[@]}" -eq 1 ]; then
4 changes: 2 additions & 2 deletions lib/backends/s3.bash
@@ -65,7 +65,7 @@ fi

function restore() {
TAR_FILE="${CACHE_KEY}.${BK_TAR_EXTENSION}"
TKEY="${BUILDKITE_ORGANIZATION_SLUG}/${BUILDKITE_PIPELINE_SLUG}"
TKEY="${BUILDKITE_ORGANIZATION_SLUG}/$(pipeline_slug)"
BUCKET="${BUILDKITE_PLUGIN_CACHE_S3_BUCKET}/${TKEY}"
BK_AWS_FOUND=false

@@ -111,7 +111,7 @@ function restore() {

function cache() {
TAR_FILE="${CACHE_KEY}.${BK_TAR_EXTENSION}"
BUCKET="${BUILDKITE_PLUGIN_CACHE_S3_BUCKET}/${BUILDKITE_ORGANIZATION_SLUG}/${BUILDKITE_PIPELINE_SLUG}"
BUCKET="${BUILDKITE_PLUGIN_CACHE_S3_BUCKET}/${BUILDKITE_ORGANIZATION_SLUG}/$(pipeline_slug)"
TAR_TARGETS=""

if [ "${#paths[@]}" -eq 1 ]; then
6 changes: 3 additions & 3 deletions lib/backends/tarball.bash
@@ -45,8 +45,8 @@ else
fi

function restore() {
CACHE_PREFIX="${BK_BASE_DIR}/${BUILDKITE_ORGANIZATION_SLUG}/${BUILDKITE_PIPELINE_SLUG}"
mkdir -p "${CACHE_PREFIX}/${BUILDKITE_PIPELINE_SLUG}"
CACHE_PREFIX="${BK_BASE_DIR}/${BUILDKITE_ORGANIZATION_SLUG}/$(pipeline_slug)"
mkdir -p "${CACHE_PREFIX}/$(pipeline_slug)"
TAR_FILE="${CACHE_PREFIX}/${CACHE_KEY}.${BK_TAR_EXTENSION}"
BK_TAR_FOUND=false

@@ -88,7 +88,7 @@ function restore() {
}

function cache() {
CACHE_PREFIX="${BK_BASE_DIR}/${BUILDKITE_ORGANIZATION_SLUG}/${BUILDKITE_PIPELINE_SLUG}"
CACHE_PREFIX="${BK_BASE_DIR}/${BUILDKITE_ORGANIZATION_SLUG}/$(pipeline_slug)"

mkdir -p "${CACHE_PREFIX}"
DAYS="${BUILDKITE_PLUGIN_CACHE_TARBALL_MAX:-}"
7 changes: 7 additions & 0 deletions lib/shared.bash
@@ -88,3 +88,10 @@ function source_locating() {
function cache_locating() {
echo -e "${BK_LOG_PREFIX}🔍 Locating cache: $1"
}

# Value to be used as the pipeline slug
# Returns:
# - String
function pipeline_slug() {
  echo "${BUILDKITE_PLUGIN_CACHE_PIPELINE_SLUG_OVERRIDE:-${BUILDKITE_PIPELINE_SLUG}}"
}
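The helper relies on bash's `${var:-default}` fallback expansion: the override variable wins when set, otherwise the pipeline's own slug is used. A quick demonstration of that fallback (variable values here are illustrative):

```shell
# Same fallback pattern as pipeline_slug() above: prefer the override
# variable, fall back to the pipeline's own slug when it is unset.
pipeline_slug() {
  echo "${BUILDKITE_PLUGIN_CACHE_PIPELINE_SLUG_OVERRIDE:-${BUILDKITE_PIPELINE_SLUG}}"
}

BUILDKITE_PIPELINE_SLUG="my-pipeline"
unset BUILDKITE_PLUGIN_CACHE_PIPELINE_SLUG_OVERRIDE
DEFAULT_SLUG="$(pipeline_slug)"    # no override set -> pipeline's own slug

BUILDKITE_PLUGIN_CACHE_PIPELINE_SLUG_OVERRIDE="other-pipeline"
OVERRIDE_SLUG="$(pipeline_slug)"   # override set -> it takes precedence
```

Because every backend builds its cache path from `$(pipeline_slug)` instead of `${BUILDKITE_PIPELINE_SLUG}` directly, setting the override is enough to point all of them at the shared location.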