Insert reportedVersion in list.json #21

Open: wants to merge 5 commits into base: gh-pages
update: 10 additions, 0 deletions
@@ -7,6 +7,7 @@ const path = require('path')
const semver = require('semver')
const ethUtil = require('ethereumjs-util')
const swarmhash = require('swarmhash')
const cp = require('child_process')
Member:
This is not used that often, so I'd use the full name for readability:

Suggested change
const cp = require('child_process')
const child_process = require('child_process')

cp reads like the command-line copying utility to me :)

Comment:
very simple.😉


// This script updates the index files list.js and list.txt in the bin directory,
// as well as the soljson-latest.js files.
@@ -33,13 +34,22 @@ dirs.forEach(function (dir) {
return fs.readFileSync(path.join(__dirname, dir, file))
}

function readBuiltinVersion (file) {
// NOTE: should be using this, but it leaks memory
// return solc(require(path.join(__dirname, dir, file))).version()
const filename = path.join(__dirname, dir, file)
return cp.execSync(`/usr/bin/env node -e "var solc = require('${filename}'); console.log(solc.cwrap(('_solidity_version' in solc) ? 'solidity_version' : 'version', 'string', [])())"`).toString().trim()
Member:
It looks like it's going to be very slow. When I was rewriting the bytecode comparison scripts and tried something like this, rerunning node for each compilation was very slow (ethereum/solidity#10165 (comment)). A loop in JS was one to two orders of magnitude faster.

Is it actually a leak, or is it just that update itself is pretty bad about controlling its memory usage (ethereum/solidity#9956)? For the bytecode comparison I run prepare_report.js, which compiles around 5k test cases in a loop, twice, and it does not run out of memory in the t-bytecode-compare.yml action.

Member (Author):
Node.js is bad at controlling memory. We set the loaded emscripten module references to empty, but they still won't be GC'd in time somehow. Perhaps it is not a problem anymore with the wasm binaries?

Member:
ok, right. It's about loading the compiler into memory, not running it. Bytecode comparison loads only one version.

> Perhaps it is not a problem anymore with the wasm binaries?

We still need to run it on all those old asm.js nightlies so I think this does not help us even if it's fixed for wasm. Unless we just compute versions for asm.js once and cache them.

The script now has an option that makes it not recalculate hashes; it could have a similar one for this. The downside is that if you replace a binary, CI won't update the versions, so it would be safer to combine it with detecting which binaries changed in the PR (which adds a bit of complexity to the whole setup).

}

// ascending list (oldest version first)
const parsedList = files
.map(function (file) { return file.match(/^soljson-v([0-9.]*)(-([^+]*))?(\+(.*))?.js$/) })
.filter(function (version) { return version })
.map(function (pars) { return { path: pars[0], version: pars[1], prerelease: pars[3], build: pars[5] } })
.map(function (pars) {
console.log('Processing ' + pars.path)
const fileContent = readFile(pars.path)
pars.reportedVersion = readBuiltinVersion(pars.path)
Member:
One problem with adding it now is that the repo contains binaries for multiple platforms, and you won't be able to get versions for all of them in one run. We'd need separate runs for different platforms and then we would have to combine the gathered info.

Member:
I think we need an option for skipping this when running locally, just like you can skip hash recalculation.

pars.longVersion = buildVersion(pars)
pars.keccak256 = '0x' + ethUtil.sha3(fileContent).toString('hex')
pars.urls = [ 'bzzr://' + swarmhash(fileContent).toString('hex') ]
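For reference, the filename regex in the `parsedList` chain above splits a release filename into version, prerelease, and build capture groups. A quick sketch of what it captures (the example filenames are illustrative):

```javascript
const re = /^soljson-v([0-9.]*)(-([^+]*))?(\+(.*))?.js$/

// Capture groups: [1] version, [3] prerelease, [5] build.
const m = 'soljson-v0.5.0-nightly.2018.11.12+commit.45b4e99f.js'.match(re)
console.log(m[1]) // '0.5.0'
console.log(m[3]) // 'nightly.2018.11.12'
console.log(m[5]) // 'commit.45b4e99f'

// For a release without a prerelease tag, group 3 is undefined:
const m2 = 'soljson-v0.4.0+commit.acd334c9.js'.match(re)
console.log(m2[1]) // '0.4.0'
console.log(m2[5]) // 'commit.acd334c9'
```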