
build: include build summary at the end of the build #63

Closed
wjwwood opened this issue Jun 9, 2014 · 15 comments

Comments

@wjwwood
Member

wjwwood commented Jun 9, 2014

This summary should include things like which workspaces are being extended, which compiler is being used, and so on.

The idea is that having this at the end of the build would better notify users of their build environment.

@jbohren
Contributor

jbohren commented Jun 10, 2014

Here's an example of what the build summaries could contain:
https://gist.github.com/jbohren/6e620aa166052048ebf2

I think it would be great if the build summaries were generated as valid YAML, as they are in this example. Then we could easily parse it down the road if we wanted to.

Each of them has the following sections:

  • Build Summary
    • Contains nominal information about the build
      • how long it took
      • which run-spaces it built into
      • what run-spaces builds were based on
  • Error Summary
    • Contains error statistics
    • Contains an entry with additional debug data for each package which failed including:
      • In which stage the build failed (configure / compile)
      • Where each of its dependencies was resolved
  • Advisories
    • Contains helpful suggestions about common pitfalls like the following:
      • You built into a particular runspace but haven't sourced it yet
      • You built a workspace and extended a runspace which isn't currently sourced
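As a sketch, a YAML summary matching the sections above might look like the following. All field names and values here are invented for illustration; the real example layout is in the linked gist.

```yaml
# Hypothetical shape only; see the gist above for the actual example.
build_summary:
  duration_seconds: 152.2
  runspaces_built: [devel]
  runspaces_extended: [/opt/ros/indigo]
error_summary:
  failed_count: 1
  failed_packages:
    - name: some_pkg            # invented package name
      stage: configure          # configure / compile
      resolved_dependencies:
        roscpp: /opt/ros/indigo
advisories:
  - "You built into ./devel but have not sourced it in this shell yet."
```

Keeping the summary machine-parseable like this is what would let other tools consume it later without scraping console output.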

@mikepurvis
Member

+1 especially to the analytics about where build dependencies came from.

@jbohren
Contributor

jbohren commented Jun 10, 2014

+1 especially to the analytics about where build dependencies came from.

Yeah I forget who mentioned this during the telecon but I think that would be a huge timesaver.

@NikolausDemmel
Member

+1 for the idea, in particular also to include advisories.

You built into a particular runspace but haven't sourced it yet

Aren't you meant to avoid sourcing the 'run space' you are building into?

@jbohren
Contributor

jbohren commented Jun 11, 2014

Aren't you meant to avoid sourcing the 'run space' you are building into?

Well you have to source it at some point if you want to start using the products that are built there.

@NikolausDemmel
Member

Sure. We'll have to decide what to recommend at some point (for or against building a workspace in an environment where one of its own run spaces is sourced; recommending for it is subtly broken; recommending against it goes against the current usage of, apparently, almost all users). This was an off-topic diversion by me for this thread, however; sorry.

@mikepurvis
Member

What exactly is it which breaks? Not to get the discussion off-track, but rebuilding after sourcing is absolutely my usage model.

Is the expectation that your build and launch/testing occurs in separate terminals? If so, I don't think this is communicated at all in the tutorials.

@jbohren
Contributor

jbohren commented Jun 11, 2014

What exactly is it which breaks? Not to get the discussion off-track, but rebuilding after sourcing is absolutely my usage model.

Yeah, I also do this all the time. So does everyone I know.

@NikolausDemmel
Member

What exactly is it which breaks? Not to get the discussion off-track, but rebuilding after sourcing is absolutely my usage model.

This was discussed here: #10 (comment). Apparently it does not come up in practice much, but the potential for very subtle conflicts is there (I guess if such conflicts occur, they are seldom identified, because the user would wipe build/devel and retry before analysing the strange behaviour, and the retry would then work).

@jbohren
Contributor

jbohren commented Jun 11, 2014

@mikepurvis What exactly is it which breaks? Not to get the discussion off-track, but rebuilding after sourcing is absolutely my usage model.

@NikolausDemmel This was discussed here: #10 (comment). Apparently it does not come up in practice much, but the potential for very subtle conflicts is there (I guess if such conflicts occur, they are seldom identified, because the user would wipe build/devel and retry before analysing the strange behaviour, and the retry would then work).

Ah yeah that. Of course this is only an issue with the merged-build mode that catkin_make uses and isn't an issue for catkin_make_isolated or catkin build. Thibault goes on to say the following, and I believe he's correct:

@tkruse Unless I am mistaken, catkin_make_isolated and the parallel version do not suffer from those problems, as each package has a sane minimal build environment of its own.

@NikolausDemmel
Member

I don't think I fully understand this myself; however, William responded to that comment of Thibault's:

Unless I am mistaken, catkin_make_isolated and the parallel version do not suffer from those problems, as each package has a sane minimal build environment of its own.

catkin build builds the environment for each package by starting with the environment in which the command was run and then running source /path/to/setup.sh --extend for each of the package's direct build and run dependencies. This only applies to the non-install and the --install --isolate-install cases, so with only the --install option, each package simply takes the environment the command was run in and adds source /path/to/install/setup.sh --extend before building.

I'm not exactly sure what implications there are for sourcing a devel space and then running catkin build again.

That to me sounds like the minimal build environment is only clean of the old version of the currently building package, as well as of undeclared dependencies, if --isolate-install or --isolate-devel is used and the current workspace's setup file has not been sourced. The only other scenario AFAICS that avoids at least "building against an old version of the same package" is starting a build from scratch with a fresh devel/install folder (there it doesn't matter whether the setup files of the current workspace have been sourced). IMO the same issue can also occur when building a plain old CMake package that you have previously installed system-wide.

I have no idea whether and how this issue needs to be addressed, or whether it is better ignored.
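William's description above can be approximated with a tiny shell sketch (all paths here are hypothetical; this is a simplified model, not catkin's actual code). The per-package environment starts from the ambient shell environment and prepends one prefix per direct dependency, which is why anything already in the ambient environment, such as a previously sourced devel space, leaks into every package's build:

```shell
# Simplified model of per-package environment assembly: each dependency's
# prefix is prepended in turn, as `source /path/to/setup.sh --extend` would do.
CMAKE_PREFIX_PATH="/opt/ros/indigo"            # ambient environment (leaks in)
for dep in /tmp/ws/devel/pkg_a /tmp/ws/devel/pkg_b; do
  CMAKE_PREFIX_PATH="$dep:$CMAKE_PREFIX_PATH"  # prepend the dependency's prefix
done
echo "$CMAKE_PREFIX_PATH"
```

The point of the sketch is the first line: because the layering starts from whatever the invoking shell had, a sourced devel space is part of every package's build environment whether it was declared as a dependency or not.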

@wjwwood
Member Author

wjwwood commented Jun 11, 2014

It can be a problem to build -> source -> build, consider this:

  • catkin build --install
  • source ./install/setup.bash
  • You remove a header from the source space, but don't update uses of it
    • You would expect the build to fail on a missing header you just removed
  • Build passes because the header is in the install space you just sourced

This is subtle, but minor, and can also occur when you have package foo installed via deb, source the deb environment, then build foo from source so that it overlays the deb version, and remove the header file there. It will still be found in the deb version of foo. In this way, overlaying FHS-compliant spaces is not airtight.

At first thought, you might imagine we could simply not allow people to build a workspace on top of itself, something like "if the target space is in the CMAKE_PREFIX_PATH, fail", but then you have to consider the workflow of building multiple workspaces, on top of each other, and installing them to the same place. That is a valid workflow, but it would be broken by the above error-checking logic. I've spent some time discussing and thinking about this problem, but my conclusion has been that it is very difficult to detect and in practice doesn't cause many problems.

That being said, I usually try to keep one terminal for building and another for running, which has become a habit.

@NikolausDemmel
Member

I think the only real way around this would be a pattern like Homebrew's superenv for building packages, i.e. for each package, the environment is stripped clean of any build-related variables, and then, piece by piece, only the needed bits (per the declared dependencies) are added back. With that, it wouldn't matter what the build environment looks like in the shell where the user invokes catkin build (except for possible implicit workspace chaining). In fact, the setup files the user sources would not even need to export any build-related environment variables, only enough to run packages and for catkin build to know where to get the needed information about dependencies (i.e. the chained workspaces). It would also only have its full impact if result spaces were isolated by default, possibly with a Homebrew-Cellar-like structure where they are still symlinked into a common FHS tree for easy consumption in the "run packages from this result space" use case.

Sounds like a lot of work, and it is unclear whether it is possible at all, but ultimately it could lead to much cleaner and more reliable builds.
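A superenv-style invocation can be sketched in shell (all prefixes here are hypothetical; this is not Homebrew's or catkin's actual code): start the package's build from an empty environment via `env -i` and pass back in only the prefixes of its declared dependencies, so pollution in the invoking shell cannot reach the build:

```shell
# Hypothetical superenv-style isolation: the child process sees only what we
# explicitly pass, regardless of what is sourced in the invoking shell.
export CMAKE_PREFIX_PATH="/tmp/stale/devel"    # pollution in the user's shell
env -i PATH=/usr/bin:/bin \
    CMAKE_PREFIX_PATH=/tmp/ws/devel/dep_a:/tmp/ws/devel/dep_b \
    sh -c 'echo "$CMAKE_PREFIX_PATH"'
```

The child shell prints only the two declared-dependency prefixes; the stale devel space exported in the outer shell never appears.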

@wjwwood
Member Author

wjwwood commented Jun 16, 2014

I am familiar with Homebrew's superenv system (I helped fix some issues with it when they were first rolling it out), and I have often thought of adopting a similar mechanism for our build tool. It seems to me to be something that is difficult to get right, though not impossible. It is also a significant amount of "magic", which when it goes wrong is incredibly difficult to debug, so you really have to get it right.

@davetcoleman
Contributor

Seems like catkin_tools now has most of this build summary...

[build] Summary: All 25 packages succeeded!
[build] Ignored: 4 packages were skipped or are blacklisted.
[build] Warnings: None.
[build] Abandoned: No packages were abandoned.
[build] Failed: No packages failed.
[build] Runtime: 2 minutes and 32.2 seconds total.

Should this issue be closed?

@wjwwood wjwwood closed this as completed Apr 6, 2018