External dependencies: json-cpp, fmtlib, ranges-v3, intx, z3, cvc4 #8860
We discussed this countless times on gitter, but apparently there is no record of that here. I would say this issue should not block any other issues in https://github.com/ethereum/solidity/projects/40, in particular #3851 and #6900. For those issues to proceed I think a simple approach is fine; independently, later we could clean up dependency handling. My personal preference is for git subtrees for reasonably sized projects (and intx, nlohmann-json, ranges-v3 all fit this bill).
@axic the good thing about submodules is that you do not have to check them out and can still opt for system-installed dependencies. This ticket is also quite old, and I'm now at a ticket where fmtlib actually does make sense (Solidity LSP server) for server-side logging. I think I'll just use submodules for now until someone complains in the PR, as I can always change that at a later point. p.s.: I favor submodules for the above reasons; they can also be easily integrated into our CI.
@christianparpart I am not sure I follow how a git submodule could "opt for system installed dependencies"?
Here's another idea: why not make our cmake config agnostic to where the libs come from and just support multiple ways of getting them, as a separate step performed before the build. In #12077 we got a request to publish solc on conan, and while I think that just having the executable there does not add much, having a conan config in the repo that one could optionally use to get dependencies would be pretty convenient. My proposal would be this:
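For illustration, a source-agnostic CMake setup along these lines could look roughly as follows. This is a sketch only: the project name, `main.cpp`, and the choice of packages are illustrative, and the imported target names depend on each package's CMake config.

```cmake
# Plain find_package, no download logic in cmake itself. This works the
# same whether the packages come from the system, from vcpkg/conan (via
# their toolchain files), or from a locally populated CMAKE_PREFIX_PATH.
cmake_minimum_required(VERSION 3.13)
project(solc-example CXX)

find_package(Boost REQUIRED COMPONENTS filesystem)
find_package(fmt REQUIRED)

add_executable(solc main.cpp)
# Imported targets keep the consuming code independent of where the
# libraries were actually installed.
target_link_libraries(solc PRIVATE Boost::filesystem fmt::fmt)
```

How the packages become findable (system packages, a helper script, or an optional package manager) is then a separate, swappable step.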
@cameel on Windows CI (and me at home) we're already using vcpkg for installing dependencies. vcpkg is btw also available on all 3 major platforms (win, osx, linux), and they now also support manifest files. Personally I'd prefer to not support too many package managers (or ways to tell a project how to find them). vcpkg luckily supports that.
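For reference, a vcpkg manifest is a small `vcpkg.json` file in the repository root. This is a hypothetical sketch; the port names and version string are purely illustrative:

```json
{
  "name": "solidity",
  "version-string": "dev",
  "dependencies": [
    "boost-filesystem",
    "fmt",
    "nlohmann-json",
    "range-v3"
  ]
}
```

With a manifest present, `vcpkg install` (or configuring with vcpkg's CMake toolchain file) installs the listed ports into a project-local `vcpkg_installed` directory rather than a global location, so plain `find_package` calls pick them up.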
Conan does support it too. I have a slight preference towards Conan (vcpkg is very simplistic from what I've read and I'm a bit afraid that it will pull in some heavy MS tooling with it) but I haven't really tried either of them in practice. As long as it's just one of the methods to get packages, I don't have anything against trying it out.
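For comparison, the equivalent Conan input is a `conanfile.txt`. This is a hypothetical sketch; the package names and versions are illustrative and assume recipes exist on the remote being used:

```text
[requires]
fmt/8.1.1
range-v3/0.11.0

[generators]
CMakeDeps
CMakeToolchain
```

Running `conan install .` then resolves the packages and generates CMake config and toolchain files, so that unmodified `find_package` calls in the project locate them.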
I wasn't trying to convince us all to hard-depend on vcpkg. It just happens to be used by CI on Windows (and by me personally on my Windows machine). The solidity project does not have a single line of vcpkg outside of the CircleCI files :)
@chriseth's opinion from the chat:
A bit of discussion has started on this topic after my comment in one of the PRs (#11967 (comment)), so I'm reposting it here to keep things in context, especially my recent proposal.
Since we're accumulating more and more dependencies that are downloaded at build time, I'm trying to make my case on this once more: I really think this is an extreme antipattern and we really shouldn't be doing it.

There are two proper ways to deal with dependencies: one is not at all, i.e. let users install the dependency and have cmake find it; or, if one really wants to avoid that, for small header-only dependencies it may be acceptable to use git submodules. But I'd actually prefer keeping those external as well.

Doing any other crazy dependency management - be it hand-written or in one of the big evil "dependency manager"s - may seem like it provides convenience, but it's an absolute fallacy. If your code breaks with a new version of a dependency, the quicker you notice and fix it, the better; insulating oneself from that by pulling in fixed versions just leads to ever-growing chains of outdated dependencies.

Also https://www.youtube.com/watch?v=sBP17HQAQjk comes to my mind: the people for whom the build system should work best, i.e. the users of the build system, are ideally package maintainers, not end-users. And conversely, every end-user who may want to build from source needs to be able to install boost development packages anyway, so why not the others as well.
@cameel Ah, yeah, ok, reposting that stuff was maybe the simpler alternative to re-summarizing it here again :-).
I totally agree regarding npm - that ecosystem is just crazy and the massive dependency trees with a separate tiny lib for every single thing are not something I'd like us to replicate. On the other hand I think that having everything as a submodule/subtree is not great either. I mean, it's acceptable for tiny libs.
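For reference, the two vendoring workflows being weighed here look roughly like this (the repository URL, tag, and `deps/` path are illustrative):

```shell
# Submodule: the repo stores only a commit pointer; every contributor
# has to initialize it after cloning.
git submodule add https://github.com/fmtlib/fmt.git deps/fmt
git submodule update --init --recursive

# Subtree: the library's sources are vendored directly into the repo's
# history; clones need no extra step, at the cost of a larger repo.
git subtree add --prefix deps/fmt https://github.com/fmtlib/fmt.git 8.1.1 --squash
```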
What do you think about #8860 (comment)? It would basically give us this but with extra possibilities. I agree that forcing a dependency manager would be bad for packagers, so my proposal is now to allow but not require a manager - basically only as a quick, simple and optional way to do something that you can also accomplish manually. We'd just have a manager config with dependency versions listed (or whole supported ranges, if possible), on the condition that any manager we support has to be unobtrusive - i.e. just install stuff in such a way that cmake can find it, and move out of the way. We could also keep the option to download the libs in cmake (disabled by default), or, better, have it extracted into a separate helper script. You could choose the manager, completely ignore it and use the script, or ignore both and do things your way.
Ok, sounds like we're generally on the same page about this then. (And yes, ideally I'd also want to avoid submodules and I'd even oppose pulling in boost or z3 like that - but as you say: it may be acceptable for the tiny header-only libs, but only if we can't agree otherwise.)

I wouldn't strongly oppose stuff like adding a package manager config, although I'm also not enthusiastic about it - as far as I'm concerned all these package managers should go back to the hell they came from, so I personally don't want to play nice with any of them :-D. But on a more serious note and with less personal bias: if we add such a config, we need to keep it up to date, test it, etc., which is additional overhead with questionable gain.

I agree and am absolutely strongly for keeping cmake clean and agnostic about any of this (at most it should provide additional information in error messages when it doesn't find something, but that's it - apart from that it should be kept as close to its default behaviour as possible). And personally, I think we can keep it at just that. If we want to provide some kind of convenience on top of that (even though I think this is generally questionable), then as separate and non-invasive as possible, as you say.
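A minimal sketch of the "additional information in error messages" idea, assuming a hypothetical `Z3` find module or package config and the conventional `Z3_FOUND` result variable:

```cmake
# Look for the package quietly, then fail with an actionable message.
# No downloads are attempted; cmake stays close to default behaviour.
find_package(Z3 QUIET)
if(NOT Z3_FOUND)
    message(FATAL_ERROR
        "Z3 was not found. Install it via your system package manager "
        "(e.g. a libz3-dev package) or add its install prefix to "
        "CMAKE_PREFIX_PATH.")
endif()
```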
I'm pretty sure some people on the team would use it if it was available. At least I probably would - even if just to try it out and see if I like that workflow. Being able to easily install arbitrary versions only locally for a single project would be a nice benefit. It's not always practical to install dev stuff system-wide. I probably won't need it that much since we generally support bleeding edge versions and on Arch I have just that but it would be even more appealing on a non-rolling-release distro. I think these managers can't be that bad if they're used for the right job. I.e. to manage project dependencies and not your system.
Right. I was actually thinking mostly about the stuff that we currently download in cmake. Z3, boost and other heavy stuff I'd rather leave up to the user, as it is now. If you want it quick and dirty with the least possible effort, use the dependency manager. Otherwise you probably have opinions on how you want stuff installed and don't want a script fiddling with your system package manager anyway.
You don't need to install anything system-wide to have cmake find it, and you don't need a dependency manager for building things locally for a single project either :-).
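A sketch of that workflow, building one dependency into a project-local prefix; the paths and the `fmt` checkout are illustrative:

```shell
# Build and install a dependency into a project-local prefix,
# without touching the system.
cmake -S fmt -B fmt/build -DCMAKE_INSTALL_PREFIX="$PWD/deps"
cmake --build fmt/build --target install

# Point the consuming project at that prefix; its find_package()
# calls then locate the locally installed packages.
cmake -S solidity -B solidity/build -DCMAKE_PREFIX_PATH="$PWD/deps"
```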
I know, but installing them manually is usually a bit of a hassle. You have to install things one by one, figure out how to build them properly, etc. It's a custom process every time. With a dependency manager the build file for each lib is public, so I still have that option, but I can also just roll with it. It's also more uniform in how you manage these packages; you can easily rebuild or reinstall them wholesale. Same reason why I prefer using AUR in Arch Linux to just grabbing sources manually and building them. With how few dependencies we have in Solidity I guess it's pretty manageable without a dependency manager, but I still like the automation in principle.
I guess we could argue about this for all eternity :-). That being said, let me emphasize that - while I don't like it - I'd still be fine with using any kind of dependency management (be it just the mechanism we use now or something more involved, up to a full-blown dependency manager - even though my dislike increases along that scale ;-)), as long as it's not the default.
This issue has been marked as stale due to inactivity for the last 90 days.
Hi everyone! This issue has been automatically closed due to inactivity.
Situation: we currently use the `ExternalProject_Add` mechanism and `find_package`.

Issues with `ExternalProject_Add`: `make` requires a network connection and starts downloading.

Apart from boost, all the other dependencies except json-cpp (including nlohmann/json as a potential json-cpp replacement) are header-only or nearly header-only.

When considering to add new dependencies we could:
- keep the `ExternalProject_Add` mechanism despite its drawbacks.
- use `git submodules` for the new dependencies (or all dependencies).
- instead of `submodules` use `subtree`s.
- rely on `find_package`, while maintaining a better `install_deps` script to locally install dependencies in the build tree.
- use `find_package` with other mechanisms as fallback.
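For context, the `ExternalProject_Add` mechanism discussed above looks roughly like this (the URL and version are illustrative). Because the download is a build step, running `make` offline fails:

```cmake
include(ExternalProject)

# The archive is fetched at build time, which is why `make`
# requires a network connection.
ExternalProject_Add(range-v3-project
    URL https://github.com/ericniebler/range-v3/archive/0.11.0.tar.gz
    # URL_HASH SHA256=...   # pin the archive in a real setup
    CONFIGURE_COMMAND ""    # header-only: nothing to configure,
    BUILD_COMMAND ""        # nothing to build,
    INSTALL_COMMAND ""      # nothing to install
)
```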