
Save git commit of builds in started.json in addition to version #988

Closed · ixdy wants to merge 1 commit from the bootstrap-save-git-commit branch

Conversation

@ixdy (Member) commented Nov 4, 2016

Fixes #987, at least for bootstrapped builds.

My python is a bit rusty, so hopefully I didn't screw anything up too badly.



ixdy assigned rmmh and spxtr on Nov 4, 2016
ixdy force-pushed the bootstrap-save-git-commit branch from b44420b to 75a0959 on November 4, 2016 at 00:21

@ixdy (Member, author) commented Nov 4, 2016

Fixing this for upload-to-gcs.sh too seems like more effort than it's worth.

@spxtr (Contributor) commented Nov 4, 2016

```
$ pylint jenkins/bootstrap.py
************* Module bootstrap
W:345, 0: Anomalous backslash in string: '\{'. String constant might be missing an r prefix. (anomalous-backslash-in-string)
W:345, 0: Anomalous backslash in string: '\}'. String constant might be missing an r prefix. (anomalous-backslash-in-string)
C:345, 0: Line too long (121/100) (line-too-long)
C:344,16: Invalid variable name "m" (invalid-name)
```
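
For context, the usual fixes for those warnings look something like the sketch below. Line 345 of bootstrap.py isn't quoted in this thread, so the grep command, constant, and function names here are assumptions, not the PR's actual code:

```python
import subprocess

# Sketch only: an r prefix stops pylint treating '\{' and '\}' as (invalid)
# Python escapes -- here they are grep BRE repetition braces -- the wrapped
# string stays under 100 characters, and a descriptive name replaces 'm'.
KUBECTL_COMMIT_CMD = (
    r'kubectl version --client'
    r' | grep -o "GitCommit:\"[0-9a-f]\{7,40\}\""'
)


def find_commit():
    """Best effort: return the GitCommit reported by kubectl, or ''."""
    try:
        line = subprocess.check_output(['bash', '-c', KUBECTL_COMMIT_CMD])
    except (OSError, subprocess.CalledProcessError):
        return ''
    parts = line.decode().strip().split('"')
    return parts[1] if len(parts) > 2 else ''
```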

```python
'bash', '-c', (
"""Determine and return the version and git commit of the build."""
try:
    # First try using kubectl
```
Contributor review comment:

I don't think this is going to work for a dockerized test.

@ixdy (Member, author) replied on Nov 4, 2016:

Hm, now that I think about this, I think neither the previous version of this nor this kubectl version works correctly for CI, since we haven't extracted the kubernetes tarball by this point.
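
For readers following along, the shape under discussion is roughly the sketch below (an assumption based on the quoted hunk, not the PR's exact code): try kubectl first and fall back to the tarball's version file, which is exactly what breaks for CI and dockerized runs when neither is available yet.

```python
import re
import subprocess


def find_version_and_commit():
    """Determine and return (version, git_commit) of the build, best effort."""
    # First try kubectl -- as noted above, this fails for CI and dockerized
    # runs because the kubernetes tarball (and kubectl) isn't extracted yet.
    try:
        out = subprocess.check_output(['kubectl', 'version', '--client']).decode()
        version = re.search(r'GitVersion:"([^"]+)"', out)
        commit = re.search(r'GitCommit:"([0-9a-f]+)"', out)
        if version and commit:
            return version.group(1), commit.group(1)
    except (OSError, subprocess.CalledProcessError):
        pass
    # Fall back to the 'version' file that is written when the release
    # tarball is created (see the review thread below).
    try:
        with open('version') as handle:
            return handle.read().strip(), ''
    except IOError:
        return '', ''
```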

```
@@ -333,29 +334,42 @@ def node():


def find_version():
    """Determine and return the version of the build."""
    version_file = 'version'
```
Contributor review comment:

are we just going to ignore this file now? what was making it before?

@ixdy (Member, author) replied:

It's written when the tarball is created: https://github.com/kubernetes/kubernetes/blob/master/build-tools/lib/release.sh#L428.

It should be the same as the GitVersion reported by kubectl version, but since we're parsing that already, reading another file didn't seem necessary anymore.

@ixdy (Member, author) commented Nov 4, 2016

I'm not sure how to make progress. We can't get the version until we extract the kubernetes tarball, which we're currently doing inside docker. I might wait for @fejta to return to get his opinion.

@spxtr (Contributor) commented Nov 4, 2016

Yeah, this is tricky. I think @fejta was annoyed by it at one point too; not sure if we had any good ideas for how to avoid it.

spxtr assigned fejta and unassigned spxtr on Nov 8, 2016
@ixdy (Member, author) commented Nov 9, 2016

I think we probably want to pull this code into a separate python script which is called by e2e-runner.sh inside the e2e docker container.
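
The split being proposed could be as small as the following sketch of a standalone script (the file layout and JSON keys are assumptions); e2e-runner.sh would invoke it inside the e2e container once the kubernetes tree and tarball are actually available:

```python
#!/usr/bin/env python
"""Hypothetical sketch: emit the build's version and git commit as JSON.

Meant to be run by e2e-runner.sh inside the e2e container, after the
kubernetes tarball has been extracted and the repo checked out.
"""
import json
import subprocess
import sys


def main():
    with open('version') as handle:          # written during the release build
        version = handle.read().strip()
    commit = subprocess.check_output(
        ['git', 'rev-parse', 'HEAD']).decode().strip()
    json.dump({'version': version, 'job-git-commit': commit}, sys.stdout)


if __name__ == '__main__':
    main()
```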

@fejta (Contributor) commented Nov 10, 2016

Thoughts on #1034?

@ixdy (Member, author) commented Nov 10, 2016

This is related to #1034, though a separate effort. We need to pull out computing job-version (or whatever we call it) into a separate script, which #1034 will consume. We can then easily save the job-git-commit at the same time.

@fejta (Contributor) commented Nov 11, 2016

I don't see why this is needed. I think we can solve this without making bootstrap.py specific to kubernetes.

#1034 relies on the fact that something will export the job-version to metadata.json, which will then get put into finished.json, which gubernator and testgrid can then consume.

kubernetes/kubernetes#36654 is what exports this to metadata.json
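
In other words, the flow described here is roughly this sketch (file names, keys, and the example version string are illustrative, not the exact schema used by #1034 or kubernetes/kubernetes#36654):

```python
import json

# The job exports the version it actually tested:
with open('metadata.json', 'w') as fp:
    json.dump({'job-version': 'v1.5.0-alpha.2.380+81d96d01571eb3'}, fp)

# Later, bootstrap.py folds that metadata into finished.json, which
# gubernator and testgrid then consume:
with open('metadata.json') as fp:
    metadata = json.load(fp)
with open('finished.json', 'w') as fp:
    json.dump({'result': 'SUCCESS', 'metadata': metadata}, fp)
```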

@ixdy (Member, author) commented Nov 11, 2016

See #987, which is the issue this PR was trying to fix.

Basically, the kubernetes version doesn't always include the git sha1 hash, but our infrastructure expects it. We could make all of our tools figure out how to resolve tags into sha1 hashes, or we could just include the hash along with the rest of the metadata.

I was attempting to do the latter in this PR, but then I realized that it wouldn't work, since we don't check out kubernetes until bootstrap.py actually runs the job. I was waiting to see if we were likely to change this behavior, but it seems not. Thus this PR will not work as-is, and the logic needs to be moved elsewhere.

#1034 and kubernetes/kubernetes#36654 still haven't fixed the original issue this PR was created to resolve.
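
For comparison, the first option above (every consumer resolving tags itself) amounts to something like this sketch; the helper is hypothetical, not an existing function in test-infra, and it represents the per-tool work that recording the commit alongside the rest of the metadata would avoid:

```python
import subprocess


def resolve_to_sha1(ref, repo_dir):
    """Return the full sha1 that a tag, branch, or other ref points at."""
    # '^{commit}' peels annotated tags down to the commit they reference.
    return subprocess.check_output(
        ['git', 'rev-parse', '%s^{commit}' % ref],
        cwd=repo_dir).decode().strip()
```

Usage would look like resolve_to_sha1('v1.4.6', '/path/to/kubernetes'), with the path being whatever checkout the tool has on hand.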

ixdy assigned ixdy and unassigned rmmh and fejta on Nov 11, 2016
spxtr removed the cla: yes label on Dec 16, 2016
ixdy closed this on Jan 25, 2017
ixdy deleted the bootstrap-save-git-commit branch on May 15, 2018
grantr pushed a commit to grantr/test-infra that referenced this pull request on Feb 21, 2020