decouple inference from Base #11274

Merged
merged 9 commits from jn/coreimg into master on May 20, 2015

Conversation

@vtjnash (Member) commented May 14, 2015

Having inference inside Base made it hard to modify and slow to compile the sysimg, since bootstrapping made two complete passes over the code. Additionally, it made inference vulnerable to sabotage -- intentional or unintentional -- by the user's program.

To combat those issues, this creates a copy of the essential code required for running inference in a new Core.Inference module. This allows any part of the system around inference (esp. Base) to change, and even be recompiled or monkey-patched, without impacting the core inference algorithms.

@carnaval this also implies that you can build sysimg.jl without Core.Inference present, and that you can change the inference module in use at runtime.

Inspired by @ihnorton's subversive activities in #11148.
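As a quick editorial illustration of the decoupling (not part of the PR; the REPL checks below assume a 0.4-era build where the new module is named Core.Inference):

```julia
# Illustrative check only: Core.Inference is a separate copy of the inference
# code, distinct from the Base module that user programs can modify.
isdefined(Core, :Inference)   # true once a sysimg built with this change is loaded
Core.Inference === Base       # false: monkey-patching Base cannot clobber inference's copy
```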

@@ -1164,7 +1164,7 @@ foo4129(a::Baz4129,b::Bar41291,args...) = foo4129(a,b.f,b,args...)
foo4129(a::Baz4129,b::Bar41292,args...) = foo4129(a,b.f,b,args...)
foo4129(a::Baz4129,args...) = foo4129(a,a.b,args...)

-@test isa(foo4129(Baz4129(Bar41291(Foo4129())),1,2), Tuple{Baz4129,Bar4129,Foo4129,Int,Int})
+#@test isa(foo4129(Baz4129(Bar41291(Foo4129())),1,2), Tuple{Baz4129,Bar4129,Foo4129,Int,Int})
vtjnash (Member Author):

@carnaval any idea why this change would have broken this test and reintroduced #4129?

vtjnash (Member Author):

Never mind, I made a silly typo.

@timholy (Member) commented May 14, 2015

💯 Will make it much easier to make disruptive changes, where getting inference working can be the main bottleneck.

I bet this speeds up make clean && make quite a bit, as well as rebuilds after changes to files that aren't included in coreimg.jl.

@tkelman (Contributor) commented May 14, 2015

Strange that julia-debug is segfaulting on the Travis branch build but not the PR merge build?

@vtjnash changed the title from "WIP: decouple inference from Base" to "decouple inference from Base" on May 15, 2015
@tkelman (Contributor) commented May 15, 2015

Some seemingly unrelated changes to inlining costs here?

Might be nice to not let contrib/build_{sysimg,executable}.jl bitrot too badly.

@vtjnash (Member Author) commented May 15, 2015

> Some seemingly unrelated changes to inlining costs here?

I removed support for floating-point numbers and rewrote the algorithm in terms of integers. It should give the same answers, however.

> Might be nice to not let contrib/build_{sysimg,executable}.jl bitrot too badly.

Good point. In general, we might want to actually distribute the inference.ji file, although bootstrapping off an existing sys.ji file may also work.
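To illustrate the integer-based costs mentioned above, a hypothetical sketch (the function name, per-node costs, and budget are made up for illustration, not taken from inference.jl):

```julia
# Hypothetical sketch, not the actual inference.jl algorithm: costs expressed
# as integer units instead of floating-point weights. The accept/reject
# decision matches an equivalently scaled floating-point threshold.
function inline_worthy_sketch(nodecosts::Vector{Int}, budget::Int=100)
    total = 0
    for c in nodecosts
        total += c                       # accumulate per-expression cost
        total > budget && return false   # body too expensive to inline
    end
    return true
end

inline_worthy_sketch([10, 20, 30])   # true: total 60 fits within the budget
inline_worthy_sketch([60, 30, 20])   # false: total 110 exceeds the budget
```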

@tkelman (Contributor) commented May 15, 2015

> i removed support for floating point numbers and rewrote the algorithm in terms of integers

Oh. Right. Not so unrelated then.

@tkelman (Contributor) commented May 15, 2015

Looks like this cuts a good 2.5 minutes off of each AppVeyor build. Not too shabby.

@carnaval (Contributor) commented:

Had a quick look. The inference changes look reasonable (i.e. no semantic changes). I'm mostly afraid of all the potential places where it could be assumed that whatever the top module is, it is at least always the same. The possibilities of more flexible inference and faster bootstrap are very enticing however.

@StefanKarpinski (Member) commented:

Is there any way to know about the top module assumptions other than merging and seeing what happens? Btw, I've been in favor of changing the top module from Main to a separate Root module for some time now (which is ironic since that's how Jeff originally had it and I convinced him to just use Main as the root).

@@ -1282,27 +1282,7 @@ jl_value_t *jl_static_eval(jl_value_t *ex, void *ctx_, jl_module_t *mod,
jl_value_t *f = jl_static_eval(jl_exprarg(e,0),ctx,mod,sp,ast,sparams,allow_alloc);
if (f && jl_is_function(f)) {
jl_fptr_t fptr = ((jl_function_t*)f)->fptr;
if (fptr == &jl_apply_generic) {
Member:

What's up with this case getting removed?

vtjnash (Member Author):

This was support for a pre-0.1 deprecation, from when ccall could only handle constant pointers. I removed jl_base_module, so I figured it was sensible to remove this special case as well.

Member:

Ok, can this go on master to cut down the diff?

vtjnash (Member Author):

Sure. Almost all of this diff is valid independently of moving inference itself, so I could rearrange it into many commits or leave it as one.

vtjnash (Member Author):

Do you have a preference, or can I just merge this?

Member:

I'd rather this be split out as a separate commit to keep commits atomic.

vtjnash (Member Author):

How's this now, better?

I think so anyways.

Member:

Yes, thanks.

Contributor:

I quite like the granularity now, and the really descriptive commit messages are awesome too.

@vtjnash (Member Author) commented May 15, 2015

> I'm mostly afraid of all the potential places where it could be assumed that whatever the top module is, it is at least always the same

> Is there any way to know about the top module assumptions other than merging and seeing what happens?

The top module is assumed to provide certain functions required for basic operation. There is no documented list of these, but there are a few in inference.jl itself (to provide basic constant folding for some important operations) and a few in julia-parser (such as unsafe_cconvert). There is some inconsistency in the usage: sometimes "top" refers to functionality that is in Core, sometimes to functionality that must be provided by Base for the base language spec to make sense, and sometimes to crucial functions that inference needs to optimize.

I think this change doesn't actually need to alter that situation very much. It does, however, allow for the existence of multiple independent top modules, which will hopefully make it clearer how to distinguish between base and top in the future.
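For context on what "multiple independent top modules" means in practice, here is a minimal sketch of a module declaring itself a top module via the mechanism described in the commit messages further down; the exact ccall signature is an assumption based on that description, not copied from the sources:

```julia
# Sketch only: a module marking itself as a top module at the start of its
# definition, per the jl_set_istopmod(isprimary) mechanism described in the
# commit messages below (0.4-era Void return type assumed).
module MyTopModule

ccall(:jl_set_istopmod, Void, (Bool,), false)  # false: not the primary top module

# ...definitions the parser and inference expect a top module to provide...

end
```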

@vtjnash force-pushed the jn/coreimg branch 2 times, most recently from bd96e25 to 3218a0f on May 16, 2015
@vtjnash (Member Author) commented May 17, 2015

Travis is broken because of 7ec501d6#commitcomment-11228450.

@tkelman (Contributor) commented May 19, 2015

Is `Warning: replacing module Inference` expected?

@vtjnash (Member Author) commented May 19, 2015

Yes. And it looks like this (and #11343) are ready to merge.

@ihnorton (Member) commented:

+💯

@tkelman (Contributor) commented May 19, 2015

Odd to have a warning during normal expected operation. That'll confuse people.

vtjnash added 5 commits May 19, 2015 23:32
this was a deprecation to support efficient code generation for ccall(dlsym(x,dlopen(y)),...) from pre-0.1
this syntax is not encouraged and has not been in use for several years
making inference separate from base required modifying the definition of
TopNode to be more general, since Base may not even exist at points and
it is desirable for Core.Inference to be wholly separate from Base.
rather than special-casing Base, this introduces the concept that any
module can indicate that it is valid as a top module and will define
appropriate functions, as expected by the julia parser, to make this
legal. At the start of the module definition,
modules can ccall jl_set_istopmod(isprimary) to indicate this.
The primary top module will be used to resolve the top module for any
code that is not contained in another top module.
Only one module can be the primary top module at a time,
typically it will be Base.
… and t_ffunc::Array (for builtin functions)

this removes the dependence on being able to serialize ObjectIdDict from the inference code
@ihnorton merged commit 4c2f908 into master on May 20, 2015
@ihnorton (Member) commented:

YOLO

@sjkelly (Contributor) commented May 20, 2015

YOMO! (You Only Merge Once)

@ihnorton (Member) commented:

Cross-referencing #5155 as this makes the "modularize base in-place" discussion vastly more approachable.

(@sjkelly indeed 😄)

@tkelman (Contributor) commented May 20, 2015

We should really get rid of that warning.

@ihnorton (Member) commented:

> We should really get rid of that warning.

@tkelman done (1b6f9be)

@timholy (Member) commented May 29, 2015

I notice that when one is messing with files that are necessary for inference (e.g., tuple.jl), inference is always compiled twice. I'm not sure that was true formerly: if you didn't have to resort to make clean, I think make would just compile things once. Is this more correct, or is it something that could be further optimized (safely)?

@tkelman (Contributor) commented May 29, 2015

Maybe if inference0.ji already exists, that step could be made to skip the first-stage build?
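As a rough illustration of that suggestion (the real build is driven by make, and the file names here are placeholders, so treat this purely as a sketch of the dependency check):

```julia
# Hypothetical sketch: skip the first-stage build when inference0.ji is newer
# than the sources it depends on. File names are illustrative placeholders.
function needs_stage0_rebuild(ji::AbstractString = "inference0.ji",
                              srcs = ["coreimg.jl", "inference.jl"])
    isfile(ji) || return true                        # no cached image yet
    return any(src -> mtime(src) > mtime(ji), srcs)  # any source newer than the image?
end
```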

@vtjnash (Member Author) commented May 29, 2015

It's more correct, although it's rarely necessary (unless the change impacts the results of inference). But I didn't altogether mind discouraging altering of those files, and even eventually shrinking them further :)

@rened mentioned this pull request on May 31, 2015
simonster added a commit that referenced this pull request Sep 29, 2015
These were broken by #11274 but no one noticed. Fixes #13350.
simonster added a commit that referenced this pull request Sep 30, 2015
These were broken by #11274 but no one noticed. Fixes #13350.

(cherry picked from commit 7dfcf70)
ref #13355
bjarthur pushed a commit to bjarthur/julia that referenced this pull request Oct 27, 2015
These were broken by JuliaLang#11274 but no one noticed. Fixes JuliaLang#13350.