Trying to split testing in GHA and use docker buildx to have cache of builds (#108)

* First stuff - trying to get bake to work in GHA and local testing

* Set uid, gid in GHA

* Bake yml separate from run yml

* Test echo output

* Test echo output 2

* Test if using buildx bake in other job leads to caching from prev job where it is also used

* Try registry instead of gha for cache

* Login to ghcr first

* Login to ghcr first 2

* Also use cache in unit tests, write docker cmds in gha yml

* Also use cache in unit tests, write docker cmds in gha yml - typo

* Fix unittest bake thing

* Clean a bit, try to pull images also in the bake step for caching

* Increase sleep time to see if that makes jobs complete on GHA

* Put timeouts in place to see what is wrong on gha

* More sleep

* Try to get stdout for erroring jobs on specific GHA runs

* Show logs of tasks to find out if task runs at all

* stupid typo

* Put rsync key in docker image so file permissions are correct - for testing

* Move key into the right, readable location

* Decrease sleep times a bit, catch errors to show them before failing, and explicitly check the last output line so a slow operation does not turn into an indefinite wait
glormph authored Oct 30, 2024
1 parent 3598dc2 commit a4122f6
Showing 11 changed files with 226 additions and 108 deletions.
14 changes: 0 additions & 14 deletions .github/run_tests_integration.sh

This file was deleted.

14 changes: 0 additions & 14 deletions .github/run_tests_unit.sh

This file was deleted.

67 changes: 61 additions & 6 deletions .github/workflows/integration-tests.yml
@@ -1,5 +1,5 @@
name: integration-tests
run-name: Run integration tests
name: lint-and-test
run-name: Run linting and tests
on:
  push:
    branches:
@@ -12,21 +12,76 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup buildx
        uses: docker/setup-buildx-action@v3
      - run: docker buildx bake --file src/docker/docker-compose-testing.yml --file src/docker/docker-compose-gha-cache.json
      - run: bash run_lint.sh

      - name: Get USER_ID to env
        run: echo "USER_ID=$(id -u)" >> $GITHUB_ENV

      - name: Get GROUP_ID to env
        run: echo "GROUP_ID=$(id -g)" >> $GITHUB_ENV

      - name: Login to Docker Registry, bake/cache
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - run: docker buildx bake --file src/docker/testing-bake.yml --file src/docker/docker-compose-gha-cache.json
      - run: |
          docker compose --env-file src/docker/.compose.testing.env \
            -f src/docker/docker-compose-testing.yml run web pylint \
            -E --disable E1101,E0307 --ignore-paths \
            '.*\/migrations\/[0-9]+.*.py' analysis datasets dashboard home jobs kantele rawstatus
  integration-tests:
    runs-on: ubuntu-latest
    needs: linting
    steps:
      - uses: actions/checkout@v4
      - run: bash .github/run_tests_integration.sh
      - name: Setup buildx
        uses: docker/setup-buildx-action@v3
      - name: Get USER_ID to env
        run: echo "USER_ID=$(id -u)" >> $GITHUB_ENV
      - name: Get GROUP_ID to env
        run: echo "GROUP_ID=$(id -g)" >> $GITHUB_ENV
      - name: Login to Docker Registry to get bake cache
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - run: docker buildx bake --file src/docker/testing-bake.yml --file src/docker/docker-compose-gha-cache.json
      - run: |
          docker compose --env-file src/docker/.compose.testing.env -f src/docker/docker-compose-testing.yml \
            up --detach db mq
          sleep 5 # DB needs to be up or app crashes trying to connect
          docker compose --env-file src/docker/.compose.testing.env -f src/docker/docker-compose-testing.yml \
            run --use-aliases web python manage.py test --tag slow --exclude-tag mstulos || docker compose --env-file src/docker/.compose.testing.env -f src/docker/docker-compose-testing.yml logs storage_mvfiles storage_downloads
  unit-tests:
    runs-on: ubuntu-latest
    needs: linting
    steps:
      - uses: actions/checkout@v4
      - run: bash .github/run_tests_unit.sh
      - name: Setup buildx
        uses: docker/setup-buildx-action@v3
      - name: Get USER_ID to env
        run: echo "USER_ID=$(id -u)" >> $GITHUB_ENV
      - name: Get GROUP_ID to env
        run: echo "GROUP_ID=$(id -g)" >> $GITHUB_ENV
      - name: Login to Docker Registry to get bake cache
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - run: docker buildx bake --file src/docker/testing-bake.yml --file src/docker/docker-compose-gha-cache.json
      - run: |
          docker compose --env-file src/docker/.compose.testing.env -f src/docker/docker-compose-testing.yml \
            up --detach db mq
          sleep 5 # DB needs to be up or app crashes trying to connect
          docker compose --env-file src/docker/.compose.testing.env -f src/docker/docker-compose-testing.yml \
            run --use-aliases web python manage.py test --exclude-tag slow
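
A note on the fixed sleep in the test jobs above: both bring up db and mq detached, then sleep 5 seconds because the app crashes if it connects before the database is ready. A readiness poll is sturdier on slow runners. Below is a minimal sketch, assuming a hypothetical helper script (wait_for_port.py, not part of this commit):

# wait_for_port.py - hypothetical helper, not part of this commit.
# Polls a TCP port until it accepts connections or a deadline passes,
# instead of guessing a fixed sleep before starting the tests.
import socket
import sys
import time

def wait_for_port(host, port, timeout=30.0):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            time.sleep(0.5)  # port not open yet, retry shortly
    return False

if __name__ == '__main__':
    # Usage (hypothetical): python wait_for_port.py localhost 5432
    sys.exit(0 if wait_for_port(sys.argv[1], int(sys.argv[2])) else 1)

The workflow step could run this in place of the sleep and fail the job if the DB never comes up, assuming the db container's port is published to the runner.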
56 changes: 47 additions & 9 deletions src/backend/rawstatus/tests.py
@@ -107,7 +107,7 @@ def test_new_file(self):
        old_raw = rm.RawFile.objects.last()
        sp = self.run_script(fullp)
        # Give time for running script, so job is created before running it etc
        sleep(1)
        sleep(2)
        new_raw = rm.RawFile.objects.last()
        self.assertEqual(new_raw.pk, old_raw.pk + 1)
        sf = rm.StoredFile.objects.last()
@@ -117,20 +117,31 @@ def test_new_file(self):
        self.assertFalse(sf.checked)
        self.run_job()
        sf.refresh_from_db()
        try:
            spout, sperr = sp.communicate(timeout=10)
        except subprocess.TimeoutExpired:
            sp.terminate()
            # Properly kill children since upload.py uses multiprocessing
            os.killpg(os.getpgid(sp.pid), signal.SIGTERM)
            spout, sperr = sp.communicate()
            print(sperr.decode('utf-8'))
            raise
        self.assertTrue(sf.checked)
        spout, sperr = sp.communicate()
        explines = ['Registering 1 new file(s)',
                f'File {new_raw.name} matches remote file {new_raw.name} with ID {new_raw.pk}',
                f'Checking remote state for file {new_raw.name} with ID {new_raw.pk}',
                f'State for file {new_raw.name} with ID {new_raw.pk} was: transfer',
                f'Uploading {fullp} to {self.live_server_url}',
                f'Succesful transfer of file {fullp}',
                f'Checking remote state for file {new_raw.name} with ID {new_raw.pk}',
                f'State for file with ID {new_raw.pk} was "done"']
        for out, exp in zip(sperr.decode('utf-8').strip().split('\n'), explines):
                ]
        outlines = sperr.decode('utf-8').strip().split('\n')
        for out, exp in zip(outlines, explines):
            out = re.sub('.* - INFO - .producer.worker - ', '', out)
            out = re.sub('.* - INFO - root - ', '', out)
            self.assertEqual(out, exp)
        lastexp = f'State for file with ID {new_raw.pk} was "done"'
        self.assertEqual(re.sub('.* - INFO - .producer.worker - ', '', outlines[-1]), lastexp)

    def test_transfer_again(self):
        '''Transfer already existing file, e.g. overwrites of previously
@@ -154,7 +165,15 @@ def test_transfer_again(self):
        self.f3sf.refresh_from_db()
        self.assertFalse(self.f3sf.checked)
        self.run_job()
        spout, sperr = sp.communicate()
        try:
            spout, sperr = sp.communicate(timeout=10)
        except subprocess.TimeoutExpired:
            sp.terminate()
            # Properly kill children since upload.py uses multiprocessing
            os.killpg(os.getpgid(sp.pid), signal.SIGTERM)
            spout, sperr = sp.communicate()
            print(sperr.decode('utf-8'))
            raise
        self.f3sf.refresh_from_db()
        self.assertTrue(self.f3sf.checked)
        self.assertEqual(self.f3sf.md5, self.f3raw.source_md5)
@@ -166,11 +185,14 @@
                f'Uploading {fullp} to {self.live_server_url}',
                f'Succesful transfer of file {fullp}',
                f'Checking remote state for file {self.f3raw.name} with ID {self.f3raw.pk}',
                f'State for file with ID {self.f3raw.pk} was "done"']
        for out, exp in zip(sperr.decode('utf-8').strip().split('\n'), explines):
                ]
        outlines = sperr.decode('utf-8').strip().split('\n')
        for out, exp in zip(outlines, explines):
            out = re.sub('.* - INFO - .producer.worker - ', '', out)
            out = re.sub('.* - INFO - root - ', '', out)
            self.assertEqual(out, exp)
        lastexp = f'State for file with ID {self.f3raw.pk} was "done"'
        self.assertEqual(re.sub('.* - INFO - .producer.worker - ', '', outlines[-1]), lastexp)

    def test_transfer_same_name(self):
        # Test trying to upload file with same name/path but diff MD5
@@ -232,7 +254,15 @@ def test_transfer_file_namechanged(self):
        sp = self.run_script(fullp)
        sleep(1)
        self.run_job()
        spout, sperr = sp.communicate()
        try:
            spout, sperr = sp.communicate(timeout=10)
        except subprocess.TimeoutExpired:
            sp.terminate()
            # Properly kill children since upload.py uses multiprocessing
            os.killpg(os.getpgid(sp.pid), signal.SIGTERM)
            spout, sperr = sp.communicate()
            print(sperr.decode('utf-8'))
            raise
        newsf = rm.StoredFile.objects.last()
        self.assertEqual(newsf.pk, lastsf.pk + 1)
        self.assertEqual(rawfn.pk, newsf.rawfile_id)
@@ -272,7 +302,15 @@ def test_rsync_not_finished_yet(self):
        job.save()
        self.f3sf.checked = True
        self.f3sf.save()
        spout, sperr = sp.communicate()
        try:
            spout, sperr = sp.communicate(timeout=10)
        except subprocess.TimeoutExpired:
            sp.terminate()
            # Properly kill children since upload.py uses multiprocessing
            os.killpg(os.getpgid(sp.pid), signal.SIGTERM)
            spout, sperr = sp.communicate()
            print(sperr.decode('utf-8'))
            raise
        explines = [f'Checking remote state for file {self.f3raw.name} with ID {self.f3raw.pk}',
                f'State for file {self.f3raw.name} with ID {self.f3raw.pk} was: wait',
                f'Checking remote state for file {self.f3raw.name} with ID {self.f3raw.pk}',
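The repeated try/except around sp.communicate() in the tests above follows one pattern: give the upload script a bounded time to finish, and on timeout kill its whole process group, since a plain terminate() would orphan the children that upload.py spawns via multiprocessing, then print the captured stderr and re-raise. A condensed sketch of that pattern, assuming the child was started in its own process group (e.g. with preexec_fn=os.setsid) so that os.getpgid(sp.pid) does not also address the test runner:

# Sketch only: mirrors the inlined pattern in tests.py; assumes run_script
# launches the child in its own process group (e.g. preexec_fn=os.setsid).
import os
import signal
import subprocess

def communicate_or_kill(sp, timeout=10):
    try:
        return sp.communicate(timeout=timeout)
    except subprocess.TimeoutExpired:
        sp.terminate()
        # Kill the whole group so multiprocessing children die as well
        os.killpg(os.getpgid(sp.pid), signal.SIGTERM)
        spout, sperr = sp.communicate()
        print(sperr.decode('utf-8'))  # surface the child's log before failing
        raise

Factoring the block into a helper like this would cut the four-fold duplication. The separate assertion on outlines[-1] exists because zip() stops at the shorter sequence, so the loop alone would pass even if the script's output ended early.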
5 changes: 3 additions & 2 deletions src/docker/Dockerfile
@@ -26,8 +26,9 @@ FROM django AS django_test
USER root
RUN apt-get install -y rsync
RUN pip install pylint
RUN mkdir /kantelessh && chmod 700 /kantelessh && chown kantele: /kantelessh
USER kantele
RUN mkdir /home/kantele/.ssh && chmod 700 /home/kantele/.ssh
COPY docker/test_rsync_sshkey /kantelessh/rsync_key

# Build django static for the nginx prod container
FROM django AS django_static
@@ -36,7 +37,7 @@ RUN python manage.py collectstatic --no-input

# Compile node frontend stuff for prod container
FROM node:18 AS build_frontend
COPY ./frontend /src/frontend
COPY frontend /src/frontend
RUN cd /src/frontend/analysis && npm install && npm run build
RUN cd /src/frontend/dashboard && npm install && npm run build
RUN cd /src/frontend/datasets && npm install && npm run build
53 changes: 27 additions & 26 deletions src/docker/docker-compose-dev.yml
@@ -2,29 +2,30 @@
#

services:
  storage:
    image: django_test
    build:
      context: "${BUILD_CONTEXT_DEV:-../}"
      dockerfile: ./docker/Dockerfile
      target: django_test
      args:
        USER_ID: "${USER_ID:-You must run export USER_ID}"
        GROUP_ID: "${GROUP_ID:-You must run export GROUP_ID}"
    volumes:
      - ../../data/storage:/storage
      - ../../data/newstorage:/newstorage
      - ../../data/analysisfiles:/analysisfiles
      - ../backend:/kantele
    environment:
      APIKEY: "${APIKEY_STORAGE}"
      KANTELEHOST: nginx
      PROTOCOL: http
      TMPSHARE: '/s3storage/tmp'
      ANALYSISSHARE: /analysisfiles
      RABBITHOST: mq
      RABBIT_VHOST: "${RABBIT_VHOST}"
      RABBITUSER: "${RABBITUSER:-guest}"
      RABBITPASS: "${RABBITPASS:-guest}"
      STORAGESHARES: '/storage,/newstorage'
      PRIMARY_STORAGE: s3storage
  storage:
    image: django_test
    build:
      context: "${BUILD_CONTEXT_DEV:-../}"
      #context: ../../
      dockerfile: ./docker/Dockerfile
      target: django_test
      args:
        USER_ID: "${USER_ID:-You must run export USER_ID}"
        GROUP_ID: "${GROUP_ID:-You must run export GROUP_ID}"
    volumes:
      - ../../data/storage:/storage
      - ../../data/newstorage:/newstorage
      - ../../data/analysisfiles:/analysisfiles
      - ../backend:/kantele
    environment:
      APIKEY: "${APIKEY_STORAGE}"
      KANTELEHOST: nginx
      PROTOCOL: http
      TMPSHARE: '/s3storage/tmp'
      ANALYSISSHARE: /analysisfiles
      RABBITHOST: mq
      RABBIT_VHOST: "${RABBIT_VHOST}"
      RABBITUSER: "${RABBITUSER:-guest}"
      RABBITPASS: "${RABBITPASS:-guest}"
      STORAGESHARES: '/storage,/newstorage'
      PRIMARY_STORAGE: s3storage
28 changes: 14 additions & 14 deletions src/docker/docker-compose-gha-cache.json
@@ -1,37 +1,37 @@
{"target": {
"web": {
"cache-from": ["type=gha"],
"cache-to": ["type=gha"],
"cache-from": ["type=registry,ref=ghcr.io/glormph/kantele-web:cache"],
"cache-to": ["type=registry,ref=ghcr.io/glormph/kantele-web:cache"],
"output": ["type=docker"]
},
"db": {
"cache-from": ["type=gha"],
"cache-to": ["type=gha"],
"cache-from": ["type=registry,ref=ghcr.io/glormph/kantele-db:cache"],
"cache-to": ["type=registry,ref=ghcr.io/glormph/kantele-db:cache"],
"output": ["type=docker"]
},
"mq": {
"cache-from": ["type=gha"],
"cache-to": ["type=gha"],
"cache-from": ["type=registry,ref=ghcr.io/glormph/kantele-mq:cache"],
"cache-to": ["type=registry,ref=ghcr.io/glormph/kantele-mq:cache"],
"output": ["type=docker"]
},
"storage_mvfiles": {
"cache-from": ["type=gha"],
"cache-to": ["type=gha"],
"cache-from": ["type=registry,ref=ghcr.io/glormph/kantele-stor:cache"],
"cache-to": ["type=registry,ref=ghcr.io/glormph/kantele-stor:cache"],
"output": ["type=docker"]
},
"tulos_ingester": {
"cache-from": ["type=gha"],
"cache-to": ["type=gha"],
"cache-from": ["type=registry,ref=ghcr.io/glormph/kantele-stor:cache"],
"cache-to": ["type=registry,ref=ghcr.io/glormph/kantele-stor:cache"],
"output": ["type=docker"]
},
"storage_downloads": {
"cache-from": ["type=gha"],
"cache-to": ["type=gha"],
"cache-from": ["type=registry,ref=ghcr.io/glormph/kantele-stor:cache"],
"cache-to": ["type=registry,ref=ghcr.io/glormph/kantele-stor:cache"],
"output": ["type=docker"]
},
"upload_bay_rsync": {
"cache-from": ["type=gha"],
"cache-to": ["type=gha"],
"cache-from": ["type=registry,ref=ghcr.io/glormph/kantele-rsync:cache"],
"cache-to": ["type=registry,ref=ghcr.io/glormph/kantele-rsync:cache"],
"output": ["type=docker"]
}
}
8 changes: 4 additions & 4 deletions src/docker/docker-compose-prod.yml
@@ -5,7 +5,7 @@ services:
      - db
      - mq
    env_file:
      - ./prod-container.env
      - ./src/docker/prod-container.env
    logging:
      driver: syslog
      options:
@@ -24,8 +24,8 @@ services:
      - web
    image: kantele_nginx
    build:
      context: ../
      dockerfile: ./docker/Dockerfile
      context: ./
      dockerfile: ./src/docker/Dockerfile
      target: nginx_prod
      args:
        USER_ID: "${USER_ID:-You must run export USER_ID}"
@@ -66,7 +66,7 @@ services:
      - mq
    network_mode: host
    env_file:
      - ./prod-container.env
      - ./src/docker/prod-container.env
    logging:
      driver: syslog
      options: