
Too much time spent in CI #910

Closed · jkoan opened this issue Oct 13, 2019 · 15 comments

@jkoan
Member

jkoan commented Oct 13, 2019

When looking into this I see that three builds take over 10 minutes each. This is a problem because we only have a limited amount of CI time to spend, which, by the way, we already exceed by a lot (strange that we can still build).

For run_doxygen my suggestion would be to store no artifacts, since they are already stored within GitHub, or to zip them and upload the archive. The only problem there is the many small files.

For build_linux the artifact upload is again the problem; we don't need to upload the unpacked files, the STGZ/TGZ/TZ packages should be enough.
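
A minimal sketch of what that could look like in the config, assuming the packages end up under a build/ directory (the path and file patterns are guesses, not the actual layout):

```yaml
# Hypothetical sketch: upload only the generated packages instead of the whole build tree.
# The build/ path and the package name patterns are assumptions.
- run:
    name: Collect packages
    command: |
      mkdir -p /tmp/artifacts
      cp build/navit-*.sh build/navit-*.tar.gz /tmp/artifacts/
- store_artifacts:
    path: /tmp/artifacts
```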

For build_sailfish I have no clue; it takes long, but I haven't analyzed it yet.

We should also look into the other builds, but their build times are not as high.

https://circleci.com/workflow-run/68e792ae-fb84-47ed-9206-d42eafaecd02

@lains
Contributor

lains commented Oct 13, 2019

And it seems some builds are on hold (at least, the build results for #812 have been pending for an hour now; it seems that not having these steps completed prevents merging).

@jkoan
Member Author

jkoan commented Oct 13, 2019

That's not true, but the results are now shown in the GitHub checks tab.

@jandegr
Contributor

jandegr commented Oct 13, 2019

Hi,
How does CircleCI measure time used?

From the start of the workflow to the end of it, or as the sum of the time consumed by each individual container?

Either way, I don't see the point in making all the other steps wait until astyle finishes; maybe it makes sense to let three others run alongside it, since a failure of astyle will block a PR anyway. Or would this increase the time used on CI because of the three that were allowed to run next to astyle in case astyle fails and blocks the PR?

Years ago, when I ported the tomtom build over from the autotools version on the old build server to a cmake version on CI, I got a remark for building both the plugin version and the full version, so I disabled one of them (I don't remember which) but did not delete that build script. However, for quite a while now I have seen both get built again. I am not involved in the tomtom builds anymore, but I suppose you could poll those that are still into tomtom about which version they prefer; they can still build the other one on an as-needed basis, since the script to do so would not be deleted.

The tomtom remark falls under

We should also look into the other builds, but their build times are not as high.

and so does #911

@jandegr jandegr mentioned this issue Oct 13, 2019
@jkoan
Member Author

jkoan commented Oct 13, 2019

How does CircleCI measure time used?

It's the sum of all individual containers, including setup and artifact upload. So when we already know that the PR will fail due to astyle, we can save time here.
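
For reference, a sketch of how that gating would be expressed in .circleci/config.yml, so nothing else starts until astyle has passed (the workflow name is made up; astyle, build_linux and run_doxygen are the jobs discussed above):

```yaml
workflows:
  version: 2
  build_all:            # hypothetical workflow name
    jobs:
      - astyle
      - build_linux:
          requires:
            - astyle    # only starts once astyle has succeeded
      - run_doxygen:
          requires:
            - astyle
```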

@jandegr
Contributor

jandegr commented Oct 13, 2019

Ok, thx.
So this leaves the remark about tomtom and the Android translations I made above.

  • New remark: why build everything all over again when merging into master, when it has already been built?
    Normally, Android would then build a release version, and doxygen and download_center would run on a merge into master, but most of the other builds just do the same thing over again.

@hoehnp
Contributor

hoehnp commented Oct 13, 2019

@jkoan: sailfish probably needs more time because we have to run CircleCI using the machine executor and then install docker. It never ran reliably using the standard procedure we have for the other builds.
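
Roughly the shape of that job, for comparison with the container-based ones (the setup command is a placeholder, not the real config):

```yaml
# Sketch only: a machine-executor job pays for a full VM boot plus the docker/SDK
# installation on every run, unlike the jobs that run directly in a docker container.
build_sailfish:
  machine: true
  steps:
    - checkout
    - run:
        name: Install docker and the Sailfish build environment
        command: ./ci/setup_sailfish.sh   # hypothetical helper standing in for the real setup steps
```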

@metalstrolch
Contributor

metalstrolch commented Oct 13, 2019

Not to forget that sailfish essentially builds twice: once for arm and once for i586.

@metalstrolch
Contributor

metalstrolch commented Oct 13, 2019

Another idea:

Can we have our own base docker image? Or, even better, multiple ones?
Currently we spend a lot of time apt-getting packages into the almost empty base Ubuntu 18.04 image we use. It would help a lot if we could use an image that already contains the dependencies. We could still apt-get them to check for updates on the Ubuntu side.
For the Linux build this takes even more time than the build itself.
Or take Android: we install the NDK every time, and it's not "installing the NDK" we want to test, but building with it.
But I don't know if CircleCI allows this.
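
As a sketch of the idea, assuming such a pre-built image were published (the image name below is hypothetical):

```yaml
# Sketch: pointing a job at a pre-built image with the apt dependencies baked in,
# replacing the plain Ubuntu image plus the per-run apt-get steps.
build_linux:
  docker:
    - image: navitproject/navit-build:latest   # hypothetical image name
  steps:
    - checkout
    - run: mkdir build && cd build && cmake .. && make
```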

@jkoan
Member Author

jkoan commented Oct 13, 2019

Another idea:

Can we have our own base docker image? Or, even better, multiple ones?
Currently we spend a lot of time apt-getting packages into the almost empty base Ubuntu 18.04 image we use. It would help a lot if we could use an image that already contains the dependencies. We could still apt-get them to check for updates on the Ubuntu side.
For the Linux build this takes even more time than the build itself.
Or take Android: we install the NDK every time, and it's not "installing the NDK" we want to test, but building with it.
But I don't know if CircleCI allows this.

Yes, this is definitely the way to go. We could probably also cache all the icon conversion work via CircleCI. For Docker we have a Dockerfiles repo; for new images just open an issue and I will set up the automated build.
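
For the icon caching, something along these lines could work (the cache key, the make target and the paths are guesses, not the actual build layout):

```yaml
# Sketch of caching the converted icons between runs.
- restore_cache:
    keys:
      - icons-v1-{{ checksum "navit/icons/CMakeLists.txt" }}
- run: make -C build icons          # hypothetical target that converts the SVG icons
- save_cache:
    key: icons-v1-{{ checksum "navit/icons/CMakeLists.txt" }}
    paths:
      - build/navit/icons
```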

@hoehnp
Contributor

hoehnp commented Oct 14, 2019

@metalstrolch what do you think about dropping the build for i586? I am not aware of any device really running Sailfish OS on it.

@metalstrolch
Contributor

@hoehnp That's not entirely true. There is one device: the ill-fated Sailfish tablet. And you can use it for the Sailfish emulator in VirtualBox that comes with the regular Sailfish SDK. Looking at the statistics from OpenRepos, there have been at least some downloads for i586.

Since I don't own a tablet and haven't used the emulator in ages, I think we can drop i586 support on CI. However, I would opt for keeping the infrastructure for it intact, so that with minimal modification one can do an i586 build in a branch if one wants to, or if somebody on OpenRepos requests it.

Personally, I'm waiting for the first arm64 build of Sailfish, depending on which devices Jolla or the Russians deploy it to next. But that's another story.

@aerostitch
Contributor

aerostitch commented Nov 1, 2019

Changing to a pre-baked image changes a lot:
https://circleci.com/gh/navit-gps/navit/18810 is the sanity check just before I changed the image to the one I created from navit-gps/Dockerfiles#9. It took 4:10.

This is the subsequent run of the same stage, but with the new pre-baked image: https://circleci.com/gh/navit-gps/navit/18903. It took 1:04.

@hoehnp
Contributor

hoehnp commented Nov 4, 2019

Should we also remove the update of translations from the build completely?

@mvglasow
Contributor

Does CircleCI offer some kind of priority for concurrent jobs?

From what I can see, we are limited to 2 concurrent builds; if we spawn more (which we do), then two jobs get picked, and as soon as one of them finishes, another one gets to run, until everything is finished. The order seems quite random and not reproducible.

Execution time for our jobs differs greatly. If the longest-running job is the last to start, the other “slot” will finish long before the last, long job does, which is less-than-perfect use of available resources. In a perfect world, jobs would be scheduled in such a manner that both slots finish their work at (almost) the same time. In order to get closer to this, we should prioritize jobs based on the time we expect them to take, longer-running jobs first.

Is there a way to assign a priority to jobs?

@jkoan
Member Author

jkoan commented Mar 1, 2020

This issue can be closed, because I made a mistake and was looking at the wrong build times. Also, we have reduced the time quite a bit.

@jkoan jkoan closed this as completed Mar 1, 2020