Kryo 3.0.3 upgrade #245
Conversation
@@ -117,8 +117,8 @@ object Instantiators {
  }

  // Use call by name:
  def forClass(t: Class[_])(fn: () => Any): ObjectInstantiator =
    new ObjectInstantiator {
  def forClass(t: Class[_])(fn: () => Any): ObjectInstantiator[AnyRef] =
shouldn't this be:
def forClass[T](t: Class[T])(fn: () => T): ObjectInstantiator[T] =
new ObjectInstantiator[T] {
override def newInstance() = {
try { fn() }
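For reference, a minimal sketch of what that fully typed variant could look like, assuming Objenesis's generic `ObjectInstantiator[T]`; the `RuntimeException` error handling here is an illustrative assumption, not code taken from this PR:

```scala
import org.objenesis.instantiator.ObjectInstantiator

object TypedInstantiators {
  // Sketch only: the class and the thunk share the type parameter T, so callers get
  // back an ObjectInstantiator[T] instead of the untyped ObjectInstantiator[AnyRef].
  def forClass[T](t: Class[T])(fn: () => T): ObjectInstantiator[T] =
    new ObjectInstantiator[T] {
      override def newInstance(): T =
        try { fn() }
        catch {
          // Illustrative error handling; the real code in chill may differ.
          case e: Exception =>
            throw new RuntimeException("Could not create instance of " + t.getName, e)
        }
    }
}
```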
Looks good to me. Ian, what are the issues with using this with summingbird, which hits Storm? Have they shaded their Kryo?
No, not shaded. At Twitter I've talked to the Heron folks about it. External Storm users are going to need carbonite upgraded, IIRC.
But will our Spark tests fail? I'd really hate to break external users of summingbird.
Spark tests? Pretty much every downstream test will fail until it's fixed for this sort of change. It's pretty much unavoidable, though, or we can never upgrade Kryo :/ Of the unit tests I know of, our Storm tests in summingbird are the most at risk of failing in ways that are tricky to fix. Spark doesn't require Kryo usage, so until Spark upgrades, our tests can turn off those paths.
Yeah, I meant the summingbird tests. I guess we have to start moving up the stack.
FWIW, here's the documented experience of the Hive move: HIVE-12175
I'm trying to get Kryo 3 upgraded in the upcoming Spark 2.0. Is it possible to get this into a Maven repository so I can update the pom.xml file in Spark? I'm already testing in a local env but would like to create a branch at Spark for this.
@ozawa-hi +1, the time is now to go forward.
@@ -46,7 +46,7 @@ public void open(InputStream in) throws IOException {

  public Object deserialize(Object o) throws IOException {
    // TODO, we could share these buffers if we see that alloc is bottlenecking
    byte[] bytes = new byte[inputStream.readInt()];
    byte[] bytes = new byte[(int) Varint.readUnsignedVarLong(inputStream)];
Is there no int variant? Should we throw here if the long is > Int.MaxValue?
Oh, the code is below. Why not unsigned varint?
I think this flowed out of the Kryo API returning a long for bytes written... at the entry point to that I go to an int now, so everything else just ends up being integers.
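To make the guard being discussed concrete, here is a rough sketch (with a hypothetical helper name) of converting the var-long length to an Int at the read boundary and throwing instead of silently truncating; it illustrates the idea, not the code in this PR:

```scala
import java.io.IOException

object LengthGuard {
  // Hypothetical helper: the length comes off the wire as an unsigned var-long, so
  // convert it to an Int at the entry point and fail loudly rather than letting a
  // bare (int) cast truncate values above Int.MaxValue.
  def toLengthInt(len: Long): Int =
    if (len < 0 || len > Int.MaxValue)
      throw new IOException("Serialized length " + len + " does not fit in an Int")
    else
      len.toInt
}

// At the call site from the diff this would read roughly as:
//   val bytes = new Array[Byte](LengthGuard.toLengthInt(Varint.readUnsignedVarLong(inputStream)))
```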
Spark upgraded their Scala on the master branch to 2.11. Is it possible to change to scalaVersion := "2.11.7" in Build.scala?
We publish all these projects for both 2.10 and 2.11, so there's no reason to change the default build. It doesn't affect anything but chill itself.
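Roughly, the cross-publishing setup that makes this a non-issue looks like the following in sbt; the version strings here are illustrative, not pulled from chill's Build.scala:

```scala
// Illustrative sbt settings only; the real values live in chill's Build.scala.
scalaVersion       := "2.10.5"                    // the default build stays on 2.10
crossScalaVersions := Seq("2.10.5", "2.11.7")     // artifacts are published for both

// Running `sbt +publish` (or `sbt +test`) executes the task once per entry in
// crossScalaVersions, so 2.11 users such as Spark pick up the _2.11 artifact.
```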
https://github.com/apache/storm/blob/master/pom.xml#L231 Storm is on 2.21. Not sure what that means. Maybe that this will break summingbird-storm users.
https://issues.apache.org/jira/browse/HIVE-11269?jql=text%20~%20%22kryo%22 What a pain we have without isolated classpaths.
https://github.com/sritchie/carbonite/blob/master/project.clj#L8 is still in play: https://github.com/apache/storm/blob/02a44c7fc1b7b3a1571b326fde7bcae13e1b5c8d/pom.xml#L210. Actually, transitively, Storm depends on chill 0.3.5:
So, one way forward may be:
In any case, I think publishing either a new artifact or a new version is the way to unblock Spark.
Storm looks down to do the update if we publish a new version, based on your JIRA there. Should we just push ahead and do a bump? Scalding won't take it till 0.17 anyway, so we have some runway for Storm to update everything else. (Summingbird will need to update to a newer Storm at the same time.) Once we merge this, call it 0.8.0 or so?
@ianoc sounds good to me.
OK, awesome, sounds like a plan. Any other changes you'd like to this PR pre-merge? The other active one can be applied to both the 0.7.x and 0.8.x series, presuming all tests are good there, so I'll try to put it against both.
Forked from #230
Remove the Java version limits; we still support 1.6+.