I get a NegativeArraySizeException when loading a PLINK file #2264

Open
iris-garden opened this issue Apr 30, 2024 · 0 comments
Labels: discourse-old (migrated from discuss.hail.is; last updated more than 31 days ago)

danking said:

Hi! I get a NegativeArraySizeException when I load a PLINK file.

import hail as hl
mt = hl.import_plink('/path/to/plink-file')
mt.count_rows()

Some of the stack trace looks like this:

        at com.esotericsoftware.kryo.util.IdentityObjectIntMap.resize(IdentityObjectIntMap.java:427)
        at com.esotericsoftware.kryo.util.IdentityObjectIntMap.putStash(IdentityObjectIntMap.java:227)
        at com.esotericsoftware.kryo.util.IdentityObjectIntMap.push(IdentityObjectIntMap.java:221)
        at com.esotericsoftware.kryo.util.IdentityObjectIntMap.put(IdentityObjectIntMap.java:117)
        at com.esotericsoftware.kryo.util.IdentityObjectIntMap.putStash(IdentityObjectIntMap.java:228)
        at com.esotericsoftware.kryo.util.IdentityObjectIntMap.push(IdentityObjectIntMap.java:221)
        at com.esotericsoftware.kryo.util.IdentityObjectIntMap.put(IdentityObjectIntMap.java:117)

How can I fix this?

danking said:

This is caused by an issue involving two libraries we use, Spark and Kryo; we’re working on a long-term fix. For now, you can try starting your Spark cluster with an extra properties argument:

--properties 'spark:spark.executor.extraJavaOptions=-XX:hashCode=0,spark:spark.driver.extraJavaOptions=-XX:hashCode=0'

If you are running locally, you can try:

export PYSPARK_SUBMIT_ARGS="--driver-java-options '-XX:hashCode=0' --conf 'spark.executor.extraJavaOptions=-XX:hashCode=0' pyspark-shell"
ipython # or jupyter notebook
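
For reference, here is a minimal sketch of the local workaround done entirely from Python; the file paths are placeholders, and it assumes that setting PYSPARK_SUBMIT_ARGS before Hail starts its JVM is enough for the flags to take effect:

import os

# The JVM options must be in place before Hail/Spark launches the JVM,
# so set the environment variable before calling any Hail function.
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--driver-java-options '-XX:hashCode=0' "
    "--conf 'spark.executor.extraJavaOptions=-XX:hashCode=0' "
    "pyspark-shell"
)

import hail as hl

# import_plink takes the .bed, .bim, and .fam files separately;
# these paths are placeholders for your own data.
mt = hl.import_plink(
    bed='/path/to/data.bed',
    bim='/path/to/data.bim',
    fam='/path/to/data.fam',
)
mt.count_rows()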