[3.2] Add logs to artifacts #550

Merged: 7 commits from save-logs into main on Jul 5, 2022

Conversation

@heifner (Member) commented Jun 28, 2022

Store off logs as GitHub Workflow artifacts.
This does not address everything we would like to see archived, but it is a good first step.

@heifner changed the title from "GH-139 Add logs to artifacts" to "[3.2] Add logs to artifacts" on Jun 29, 2022
@heifner added the OCI label ("OCI working this issue...") on Jun 29, 2022
@heifner marked this pull request as ready for review on June 29, 2022 14:42
@heifner requested a review from linh2931 on June 30, 2022 00:16
@heifner changed the base branch from main to release/3.1.x on June 30, 2022 00:17
@heifner changed the base branch from release/3.1.x back to main on June 30, 2022 00:17

The review thread below is on the workflow's artifact upload step:

    with:
      name: parallel-logs
      path: |
        build/var/*
A Member commented:

I went to the artifacts page https://github.com/eosnetworkfoundation/mandel/actions/runs/2580037524#artifacts for this PR's CICD and downloaded forked-chain-logs and irreversible-logs. I only saw var/subprocess_results.log and etc/eosio/launcher/testnet.template; I did not see directories like var/lib/node_00, etc/eosio/node_00, ...

I'm not sure whether you need to specify all the directory levels (e.g. build/var/lib/*/* to get build/var/lib/node_00), or just build to capture the whole tree. Or perhaps those node directories were somehow deleted.
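
For what it's worth, a minimal sketch (action version and exact paths assumed) of an upload step whose recursive globs would pick up the nested node directories without spelling out each level:

    - name: Upload logs
      uses: actions/upload-artifact@v3   # version assumed
      with:
        name: forked-chain-logs
        path: |
          build/var/**
          build/etc/**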

@heifner (Member Author) commented Jun 30, 2022

They are cleaned up by the tests on successful runs. We might choose to add --keep-logs to our tests in the future so they are preserved for successful runs, but we do not currently do that.

@heifner (Member Author):

Just kicked off #567 so we can verify logs are saved on failed tests.

@heifner mentioned this pull request on Jun 30, 2022
@spoonincode (Member) left a comment:

It looks like you need to set continue-on-error: true for the test step so that the upload step still executes when tests fail

@heifner (Member Author) commented Jul 1, 2022

> It looks like you need to set continue-on-error: true for the test step so that the upload step still executes when tests fail

continue-on-error: true would mark the test step as passing even when it failed, so I used if: always() on the upload step instead. See #567 for an example run with test failures.
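
A minimal sketch of that pattern (step names, action version, and the ctest invocation assumed), with if: always() on the upload step rather than continue-on-error on the test step:

    - name: Run tests
      run: ctest ...                     # hypothetical invocation; a failure here still fails the job
    - name: Upload logs
      if: always()                       # run the upload even when the test step above failed
      uses: actions/upload-artifact@v3
      with:
        name: parallel-logs
        path: build/var/*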

@spoonincode (Member):

It seems like the upload was removed from some jobs? Like parallel and wasm tests.

@heifner (Member Author) commented Jul 1, 2022

> It seems like the upload was removed from some jobs? Like parallel and wasm tests.

Yes. Those jobs do not log anything to var/* or etc/*. They only have the output from ctest. I did add -V to ctest.
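
For reference, a minimal sketch (step name and parallelism flag assumed) of passing -V so each test's output ends up in the job log:

    - name: Run parallel tests
      run: ctest -j $(nproc) -V          # -V echoes every test's stdout/stderr into the workflow log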

@linh2931 (Member) left a comment:

I see the logs now in the test PR. Just one thing: those saved files are very big, 4 GB for a test. Will the copying slow down successful test runs? I don't think we need to pay for that storage.

@heifner (Member Author) commented Jul 1, 2022

> I see the logs now in the test PR. Just one thing: those saved files are very big, 4 GB for a test. Will the copying slow down successful test runs? I don't think we need to pay for that storage.

My original thought was that it might be useful to compare against a successful run's output, but since we are not currently passing --keep-logs to the tests, it is actually of little value. I'll switch this to if: failure().
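
A minimal variant of the earlier sketch (action version assumed), uploading only when a previous step in the job failed:

    - name: Upload logs
      if: failure()                      # skip the upload entirely on successful runs
      uses: actions/upload-artifact@v3
      with:
        name: parallel-logs
        path: build/var/*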

@spoonincode (Member) left a comment:

one sec

@spoonincode (Member):

I think you should instead tarball up (with zstd) these logs into a single artifact upload (per job, of course). Based on

https://github.com/actions/upload-artifact#too-many-uploads-resulting-in-429-responses

and how long it took for my download of this zip file to start, I suspect you're actually uploading 10+ GB when the logs are uploaded.

@spoonincode (Member):

Okay, the good news is that it is gzip-compressing each file, but it's still going to be a lot more efficient if we just tarball these up before sending them. In particular, tar will handle those shared_memory.bin sparse files a lot better.
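
A minimal sketch of that approach (paths assumed, and assuming the runner's tar can invoke zstd), packing everything into one sparse-aware archive so each job uploads a single file:

    - name: Pack logs
      if: failure()
      run: tar -I zstd --sparse -cf logs.tar.zst build/var   # --sparse keeps shared_memory.bin small; one archive, one compression pass
    - name: Upload logs
      if: failure()
      uses: actions/upload-artifact@v3
      with:
        name: parallel-logs
        path: logs.tar.zst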

@spoonincode (Member):

Ugh:
https://github.com/actions/upload-artifact/blob/3cea5372237819ed00197afe530f5a7ea3e805c8/dist/index.js#L7998-L8005
Maybe not zstd then, or lie about the extension 🤷. Too bad they don't have a parameter to skip the compression attempt.

@heifner requested a review from spoonincode on July 2, 2022 18:04
@heifner merged commit e816432 into main on Jul 5, 2022
@heifner deleted the save-logs branch on July 5, 2022 12:11