Bug report

Expected behavior and actual behavior

Expected: Pipeline runs to completion without errors.
Actual: The pipeline crashes due to an open queue linked to the collectFile operator.
Steps to reproduce the problem
GitHub repo here: https://github.com/APHA-CSU/btb-seq/tree/s3direct
Run as follows: nextflow run APHA-CSU/btb-seq -r s3direct --reads="s3://s3-bucket/*_{S*_R1,S*_R2}*.fastq.gz"
The pipeline runs as expected until the CombineOutput process, which cannot be initiated. The output attached below indicates that one of the collectFile operator queues remains open. The offending snippet is here.
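For context, the failing step corresponds to a collectFile-fed process of roughly the shape sketched below. This is a hedged reconstruction from the offending keys in the log, not the actual btb-seq source; the channel names (assigned_ch, qbovis_ch) and their upstream sources are hypothetical placeholders:

```nextflow
// Hedged reconstruction from the log below -- not the actual btb-seq source.
// assigned_ch / qbovis_ch are hypothetical names for the collectFile outputs.
assigned_ch = assigned_results_ch.collectFile(name: 'Assigned.csv')
qbovis_ch   = qbovis_results_ch.collectFile(name: 'Qbovis.csv')

process CombineOutput {
    input:
    file 'Assigned.csv' from assigned_ch
    file 'Qbovis.csv' from qbovis_ch

    script:
    """
    combineCsv.py Assigned.csv Qbovis.csv ${params.DataDir}
    """
}
```

The "port 1: (queue) OPEN" line at the end of the log appears to correspond to the Qbovis.csv input channel of this process, which never completes because the task's unique hash cannot be computed.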
Program output (error.nextflow.log)

Caused by:
Oops.. something wrong happened while creating task 'CombineOutput' unique id -- Offending keys: [
- type=java.util.UUID value=0ad5d850-cdf0-4713-9f76-81413e61fdd0,
- type=java.lang.String value=CombineOutput,
- type=java.lang.String value="""
combineCsv.py Assigned.csv Qbovis.csv ${params.DataDir}
"""
,
- type=java.lang.String value=Assigned.csv,
- type=nextflow.util.ArrayBag value=[FileHolder(sourceObj:/processing/TestDec21/work/stage/6e/9e76fa42efffae5a0de0ec944d13af/NB552234_0093_AssignedWGSCluster_17Dec21.csv, storePath:/processing/TestDec21/work/stage/6e/9e76fa42efffae5a0de0ec944d13af/NB552234_0093_AssignedWGSCluster_17Dec21.csv, stageName:Assigned.csv)],
- type=java.lang.String value=Qbovis.csv,
- type=nextflow.util.ArrayBag value=[FileHolder(sourceObj:/processing/TestDec21/work/stage/6a/18d54fffd943dfbcb59fc063a57a77/NB552234_0093_BovPos_17Dec21.csv, storePath:/processing/TestDec21/work/stage/6a/18d54fffd943dfbcb59fc063a57a77/NB552234_0093_BovPos_17Dec21.csv, stageName:Qbovis.csv)],
- type=java.lang.String value=$,
- type=java.lang.Boolean value=true,
- type=java.util.HashMap$EntrySet value=[params.DataDir=NB552234_0093],
- type=sun.nio.fs.UnixPath value=/home/richardellis/btb-seq-2dev/btb-seq/bin/combineCsv.py]
nextflow.exception.UnexpectedException: Oops.. something wrong happened while creating task 'CombineOutput' unique id -- Offending keys: [
- type=java.util.UUID value=0ad5d850-cdf0-4713-9f76-81413e61fdd0,
- type=java.lang.String value=CombineOutput,
- type=java.lang.String value="""
combineCsv.py Assigned.csv Qbovis.csv ${params.DataDir}
"""
,
- type=java.lang.String value=Assigned.csv,
- type=nextflow.util.ArrayBag value=[FileHolder(sourceObj:/processing/TestDec21/work/stage/6e/9e76fa42efffae5a0de0ec944d13af/NB552234_0093_AssignedWGSCluster_17Dec21.csv, storePath:/processing/TestDec21/work/stage/6e/9e76fa42efffae5a0de0ec944d13af/NB552234_0093_AssignedWGSCluster_17Dec21.csv, stageName:Assigned.csv)],
- type=java.lang.String value=Qbovis.csv,
- type=nextflow.util.ArrayBag value=[FileHolder(sourceObj:/processing/TestDec21/work/stage/6a/18d54fffd943dfbcb59fc063a57a77/NB552234_0093_BovPos_17Dec21.csv, storePath:/processing/TestDec21/work/stage/6a/18d54fffd943dfbcb59fc063a57a77/NB552234_0093_BovPos_17Dec21.csv, stageName:Qbovis.csv)],
- type=java.lang.String value=$,
- type=java.lang.Boolean value=true,
- type=java.util.HashMap$EntrySet value=[params.DataDir=NB552234_0093],
- type=sun.nio.fs.UnixPath value=/home/richardellis/btb-seq-2dev/btb-seq/bin/combineCsv.py]
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:72)
at org.codehaus.groovy.reflection.CachedConstructor.doConstructorInvoke(CachedConstructor.java:59)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrap.callConstructor(ConstructorSite.java:84)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:59)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:263)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:286)
at nextflow.processor.TaskProcessor.computeHash(TaskProcessor.groovy:1988)
at nextflow.processor.TaskProcessor$computeHash$5.callCurrent(Unknown Source)
at nextflow.processor.TaskProcessor.createTaskHashKey(TaskProcessor.groovy:1975)
at jdk.internal.reflect.GeneratedMethodAccessor220.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.codehaus.groovy.runtime.callsite.PlainObjectMetaMethodSite.doInvoke(PlainObjectMetaMethodSite.java:43)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:193)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:61)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:185)
at nextflow.processor.TaskProcessor.invokeTask(TaskProcessor.groovy:591)
at nextflow.processor.InvokeTaskAdapter.call(InvokeTaskAdapter.groovy:59)
at groovyx.gpars.dataflow.operator.DataflowOperatorActor.startTask(DataflowOperatorActor.java:120)
at groovyx.gpars.dataflow.operator.ForkingDataflowOperatorActor.access$001(ForkingDataflowOperatorActor.java:35)
at groovyx.gpars.dataflow.operator.ForkingDataflowOperatorActor$1.run(ForkingDataflowOperatorActor.java:58)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.IllegalArgumentException: The bucket name parameter must be specified when listing objects in a bucket
at com.amazonaws.services.s3.AmazonS3Client.rejectNull(AmazonS3Client.java:3839)
at com.amazonaws.services.s3.AmazonS3Client.listObjects(AmazonS3Client.java:861)
at com.upplication.s3fs.AmazonS3Client.listObjects(AmazonS3Client.java:105)
at com.upplication.s3fs.util.S3ObjectSummaryLookup.lookup(S3ObjectSummaryLookup.java:113)
at com.upplication.s3fs.S3FileSystemProvider.readAttributes(S3FileSystemProvider.java:669)
at java.base/java.nio.file.Files.readAttributes(Files.java:1764)
at nextflow.util.CacheHelper.hashFile(CacheHelper.java:239)
at nextflow.util.CacheHelper.hasher(CacheHelper.java:186)
at nextflow.util.CacheHelper.hasher(CacheHelper.java:169)
at nextflow.util.CacheHelper.hasher(CacheHelper.java:111)
at nextflow.util.CacheHelper.hasher(CacheHelper.java:107)
at nextflow.util.CacheHelper.hashUnorderedCollection(CacheHelper.java:376)
at nextflow.util.CacheHelper.hasher(CacheHelper.java:174)
at nextflow.util.CacheHelper.hasher(CacheHelper.java:178)
at nextflow.util.CacheHelper.hasher(CacheHelper.java:111)
at nextflow.util.CacheHelper.hasher(CacheHelper.java:107)
at nextflow.util.CacheHelper$hasher.call(Unknown Source)
at nextflow.processor.TaskProcessor.computeHash(TaskProcessor.groovy:1984)
... 17 common frames omitted
Dec-17 13:23:18.336 [Actor Thread 55] DEBUG nextflow.Session - Session aborted -- Cause: Oops.. something wrong happened while creating task 'CombineOutput' unique id -- Offending keys: [
- type=java.util.UUID value=0ad5d850-cdf0-4713-9f76-81413e61fdd0,
- type=java.lang.String value=CombineOutput,
- type=java.lang.String value="""
combineCsv.py Assigned.csv Qbovis.csv ${params.DataDir}
"""
,
- type=java.lang.String value=Assigned.csv,
- type=nextflow.util.ArrayBag value=[FileHolder(sourceObj:/processing/TestDec21/work/stage/6e/9e76fa42efffae5a0de0ec944d13af/NB552234_0093_AssignedWGSCluster_17Dec21.csv, storePath:/processing/TestDec21/work/stage/6e/9e76fa42efffae5a0de0ec944d13af/NB552234_0093_AssignedWGSCluster_17Dec21.csv, stageName:Assigned.csv)],
- type=java.lang.String value=Qbovis.csv,
- type=nextflow.util.ArrayBag value=[FileHolder(sourceObj:/processing/TestDec21/work/stage/6a/18d54fffd943dfbcb59fc063a57a77/NB552234_0093_BovPos_17Dec21.csv, storePath:/processing/TestDec21/work/stage/6a/18d54fffd943dfbcb59fc063a57a77/NB552234_0093_BovPos_17Dec21.csv, stageName:Qbovis.csv)],
- type=java.lang.String value=$,
- type=java.lang.Boolean value=true,
- type=java.util.HashMap$EntrySet value=[params.DataDir=NB552234_0093],
- type=sun.nio.fs.UnixPath value=/home/richardellis/btb-seq-2dev/btb-seq/bin/combineCsv.py]
Dec-17 13:23:18.351 [Actor Thread 55] DEBUG nextflow.Session - The following nodes are still active:
[process] CombineOutput
status=ACTIVE
port 0: (queue) closed; channel: Assigned.csv
port 1: (queue) OPEN ; channel: Qbovis.csv
port 2: (cntrl) - ; channel: $
Environment
Nextflow version: 21.04.1
Java version: openjdk version "11.0.11"
Operating system: Ubuntu 18.04.4 LTS
Bash version: GNU bash, version 4.4.20(1)-release (x86_64-pc-linux-gnu)
Additional context
Whilst this appears to be linked to collectFile, I think it's more likely to be caused by an upstream process linked to external downloads, so this may be related to #1230 and possibly nf-core/sarek#130. This is because the issue does not occur when I run the exact same pipeline with the (exact same) input files stored locally.
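One hedged workaround sketch, consistent with the observation that local inputs work: stage the S3 objects onto local disk in a dedicated process before anything feeding collectFile runs, so downstream operators only ever see local paths. StageReads and its channel names are hypothetical, not part of btb-seq, and this is untested against this pipeline:

```nextflow
// Hypothetical staging step: Nextflow copies the S3 objects into the task
// work directory, so every downstream path (and task hash input) is local.
process StageReads {
    input:
    tuple val(sample_id), file(reads) from Channel.fromFilePairs(params.reads)

    output:
    tuple val(sample_id), file("staged/*") into local_reads_ch

    script:
    """
    mkdir staged
    cp -L ${reads[0]} ${reads[1]} staged/
    """
}
```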