C:\programs\spark-dev>.\bin\spark-submit2.cmd --conf spark.hadoop.fs.defaultFS="file:///" R\pkg\tests\run-all.R
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/programs/spark-dev/assembly/target/scala-2.11/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/programs/hadoop-2.8.1/naxExtraJars/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Warning message:
package 'testthat' was built under R version 3.4.3
Loading required package: methods

Attaching package: 'SparkR'

The following objects are masked from 'package:testthat':

    describe, not

The following objects are masked from 'package:stats':

    cov, filter, lag, na.omit, predict, sd, var, window

The following objects are masked from 'package:base':

    as.data.frame, colnames, colnames<-, drop, endsWith, intersect, rank,
    rbind, sample, startsWith, subset, summary, transform, union

Spark package found in SPARK_HOME: C:\programs\spark-dev\R\..
18/01/22 15:38:47 INFO spark.SparkContext: Running Spark version 2.3.0-SNAPSHOT
18/01/22 15:38:47 INFO spark.SparkContext: Submitted application: SparkR
18/01/22 15:38:47 INFO spark.SecurityManager: Changing view acls to: nam23
18/01/22 15:38:47 INFO spark.SecurityManager: Changing modify acls to: nam23
18/01/22 15:38:47 INFO spark.SecurityManager: Changing view acls groups to:
18/01/22 15:38:47 INFO spark.SecurityManager: Changing modify acls groups to:
18/01/22 15:38:47 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(nam23); groups with view permissions: Set(); users with modify permissions: Set(nam23); groups with modify permissions: Set()
18/01/22 15:38:47 INFO util.Utils: Successfully started service 'sparkDriver' on port 53037.
18/01/22 15:38:47 INFO spark.SparkEnv: Registering MapOutputTracker
18/01/22 15:38:47 INFO spark.SparkEnv: Registering BlockManagerMaster
18/01/22 15:38:47 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/01/22 15:38:47 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/01/22 15:38:47 INFO storage.DiskBlockManager: Created local directory at C:\Users\nam23\AppData\Local\Temp\blockmgr-2bc5eaa3-e068-4a44-b2ec-2f7f9dc77340
18/01/22 15:38:47 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
18/01/22 15:38:47 INFO spark.SparkEnv: Registering OutputCommitCoordinator
18/01/22 15:38:47 INFO util.log: Logging initialized @4098ms
18/01/22 15:38:47 INFO server.Server: jetty-9.3.z-SNAPSHOT
18/01/22 15:38:47 INFO server.Server: Started @4216ms
18/01/22 15:38:47 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
18/01/22 15:38:47 INFO server.AbstractConnector: Started ServerConnector@7c9936a0{HTTP/1.1,[http/1.1]}{0.0.0.0:4041}
18/01/22 15:38:47 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2b0419f0{/jobs,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@419a13c4{/jobs/json,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7a785457{/jobs/job,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@188bebc3{/jobs/job/json,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@20e9ce69{/stages,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@16cc831f{/stages/json,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@69c502b4{/stages/stage,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@320597e9{/stages/stage/json,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7394e3bf{/stages/pool,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6419beb8{/stages/pool/json,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3de7fdda{/storage,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@23438f50{/storage/json,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@771e099c{/storage/rdd,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5e5b95b9{/storage/rdd/json,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5cd0aa85{/environment,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63569740{/environment/json,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6483305f{/executors,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6677972b{/executors/json,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3f2a9de2{/executors/threadDump,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6891f1d0{/executors/threadDump/json,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@15ed2555{/static,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@f01f358{/,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2c8259a6{/api,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2e77f1ff{/jobs/job/kill,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6861a9cb{/stages/stage/kill,null,AVAILABLE,@Spark}
18/01/22 15:38:47 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://DESKTOP-BBK0H95:4041
18/01/22 15:38:48 INFO spark.SparkContext: Added file file:/C:/programs/spark-dev/R/pkg/tests/run-all.R at file:/C:/programs/spark-dev/R/pkg/tests/run-all.R with timestamp 1516653528131
18/01/22 15:38:48 INFO util.Utils: Copying C:\programs\spark-dev\R\pkg\tests\run-all.R to C:\Users\nam23\AppData\Local\Temp\spark-6e66f881-4d3e-406e-ac32-346ed40bd635\userFiles-6e816f18-230f-4d5f-9ef9-ef564817091f\run-all.R
18/01/22 15:38:48 INFO executor.Executor: Starting executor ID driver on host localhost
18/01/22 15:38:48 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 53046.
18/01/22 15:38:48 INFO netty.NettyBlockTransferService: Server created on DESKTOP-BBK0H95:53046
18/01/22 15:38:48 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/01/22 15:38:48 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, DESKTOP-BBK0H95, 53046, None)
18/01/22 15:38:48 INFO storage.BlockManagerMasterEndpoint: Registering block manager DESKTOP-BBK0H95:53046 with 366.3 MB RAM, BlockManagerId(driver, DESKTOP-BBK0H95, 53046, None)
18/01/22 15:38:48 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, DESKTOP-BBK0H95, 53046, None)
18/01/22 15:38:48 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, DESKTOP-BBK0H95, 53046, None)
18/01/22 15:38:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@642e21d3{/metrics/json,null,AVAILABLE,@Spark}
18/01/22 15:38:49 INFO internal.SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/C:/programs/spark-dev/spark-warehouse/').
18/01/22 15:38:49 INFO internal.SharedState: Warehouse path is 'file:/C:/programs/spark-dev/spark-warehouse/'.
18/01/22 15:38:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6921b3bd{/SQL,null,AVAILABLE,@Spark}
18/01/22 15:38:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@215aae{/SQL/json,null,AVAILABLE,@Spark}
18/01/22 15:38:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3f4de178{/SQL/execution,null,AVAILABLE,@Spark}
18/01/22 15:38:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7b218d9d{/SQL/execution/json,null,AVAILABLE,@Spark}
18/01/22 15:38:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@70c084e{/static/sql,null,AVAILABLE,@Spark}
18/01/22 15:38:49 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
18/01/22 15:38:50 INFO spark.SparkContext: Starting job: collectPartitions at NativeMethodAccessorImpl.java:0
18/01/22 15:38:50 INFO scheduler.DAGScheduler: Got job 0 (collectPartitions at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:38:50 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (collectPartitions at NativeMethodAccessorImpl.java:0)
18/01/22 15:38:50 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:38:50 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:38:50 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (ParallelCollectionRDD[0] at parallelize at RRDD.scala:151), which has no missing parents
18/01/22 15:38:50 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1424.0 B, free 366.3 MB)
18/01/22 15:38:50 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 961.0 B, free 366.3 MB)
18/01/22 15:38:50 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 961.0 B, free: 366.3 MB)
18/01/22 15:38:50 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1029
18/01/22 15:38:50 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (ParallelCollectionRDD[0] at parallelize at RRDD.scala:151) (first 15 tasks are for partitions Vector(0))
18/01/22 15:38:50 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
18/01/22 15:38:50 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7951 bytes)
18/01/22 15:38:50 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
18/01/22 15:38:50 INFO executor.Executor: Fetching file:/C:/programs/spark-dev/R/pkg/tests/run-all.R with timestamp 1516653528131
18/01/22 15:38:50 INFO util.Utils: C:\programs\spark-dev\R\pkg\tests\run-all.R has been previously copied to C:\Users\nam23\AppData\Local\Temp\spark-6e66f881-4d3e-406e-ac32-346ed40bd635\userFiles-6e816f18-230f-4d5f-9ef9-ef564817091f\run-all.R
18/01/22 15:38:50 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 860 bytes result sent to driver
18/01/22 15:38:50 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 353 ms on localhost (executor driver) (1/1)
18/01/22 15:38:50 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
18/01/22 15:38:50 INFO scheduler.DAGScheduler: ResultStage 0 (collectPartitions at NativeMethodAccessorImpl.java:0) finished in 0.661 s
18/01/22 15:38:50 INFO scheduler.DAGScheduler: Job 0 finished: collectPartitions at NativeMethodAccessorImpl.java:0, took 0.732263 s
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 5
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 24
18/01/22 15:38:52 INFO storage.BlockManagerInfo: Removed broadcast_0_piece0 on DESKTOP-BBK0H95:53046 in memory (size: 961.0 B, free: 366.3 MB)
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 18
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 9
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 4
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 22
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 10
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 13
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 17
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 12
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 16
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 15
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 14
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 21
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 7
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 0
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 1
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 2
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 19
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 23
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 3
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 8
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 11
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 20
18/01/22 15:38:52 INFO spark.ContextCleaner: Cleaned accumulator 6
18/01/22 15:38:53 INFO codegen.CodeGenerator: Code generated in 276.316405 ms
18/01/22 15:38:53 INFO codegen.CodeGenerator: Code generated in 23.464456 ms
18/01/22 15:38:53 INFO spark.SparkContext: Starting job: count at NativeMethodAccessorImpl.java:0
18/01/22 15:38:53 INFO scheduler.DAGScheduler: Registering RDD 6 (count at NativeMethodAccessorImpl.java:0)
18/01/22 15:38:53 INFO scheduler.DAGScheduler: Got job 1 (count at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:38:53 INFO scheduler.DAGScheduler: Final stage: ResultStage 2 (count at NativeMethodAccessorImpl.java:0)
18/01/22 15:38:53 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 1)
18/01/22 15:38:53 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 1)
18/01/22 15:38:53 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 1 (MapPartitionsRDD[6] at count at NativeMethodAccessorImpl.java:0), which has no missing parents
18/01/22 15:38:53 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 17.9 KB, free 366.3 MB)
18/01/22 15:38:53 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 8.1 KB, free 366.3 MB)
18/01/22 15:38:53 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 8.1 KB, free: 366.3 MB)
18/01/22 15:38:53 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1029
18/01/22 15:38:53 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 1 (MapPartitionsRDD[6] at count at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
18/01/22 15:38:53 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
18/01/22 15:38:53 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, PROCESS_LOCAL, 7940 bytes)
18/01/22 15:38:53 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 1)
18/01/22 15:38:55 INFO codegen.CodeGenerator: Code generated in 31.982195 ms
18/01/22 15:38:55 INFO codegen.CodeGenerator: Code generated in 49.308846 ms
18/01/22 15:38:55 INFO r.RRunner: Times: boot = 0.338 s, init = 1.000 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.060 s, total = 1.418 s
18/01/22 15:38:55 INFO executor.Executor: Finished task 0.0 in stage 1.0 (TID 1). 1548 bytes result sent to driver
18/01/22 15:38:55 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 1597 ms on localhost (executor driver) (1/1)
18/01/22 15:38:55 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
18/01/22 15:38:55 INFO scheduler.DAGScheduler: ShuffleMapStage 1 (count at NativeMethodAccessorImpl.java:0) finished in 1.649 s
18/01/22 15:38:55 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:38:55 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:38:55 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 2)
18/01/22 15:38:55 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Submitting ResultStage 2 (MapPartitionsRDD[9] at count at NativeMethodAccessorImpl.java:0), which has no missing parents
18/01/22 15:38:55 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 7.1 KB, free 366.3 MB)
18/01/22 15:38:55 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 3.7 KB, free 366.3 MB)
18/01/22 15:38:55 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 3.7 KB, free: 366.3 MB)
18/01/22 15:38:55 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1029
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (MapPartitionsRDD[9] at count at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
18/01/22 15:38:55 INFO scheduler.TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
18/01/22 15:38:55 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 2.0 (TID 2, localhost, executor driver, partition 0, ANY, 7754 bytes)
18/01/22 15:38:55 INFO executor.Executor: Running task 0.0 in stage 2.0 (TID 2)
18/01/22 15:38:55 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
18/01/22 15:38:55 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 17 ms
18/01/22 15:38:55 INFO executor.Executor: Finished task 0.0 in stage 2.0 (TID 2). 1825 bytes result sent to driver
18/01/22 15:38:55 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 2.0 (TID 2) in 89 ms on localhost (executor driver) (1/1)
18/01/22 15:38:55 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool
18/01/22 15:38:55 INFO scheduler.DAGScheduler: ResultStage 2 (count at NativeMethodAccessorImpl.java:0) finished in 0.131 s
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Job 1 finished: count at NativeMethodAccessorImpl.java:0, took 1.845828 s
18/01/22 15:38:55 INFO spark.SparkContext: Starting job: collectPartitions at NativeMethodAccessorImpl.java:0
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Got job 2 (collectPartitions at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Final stage: ResultStage 3 (collectPartitions at NativeMethodAccessorImpl.java:0)
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Submitting ResultStage 3 (ParallelCollectionRDD[10] at parallelize at RRDD.scala:151), which has no missing parents
18/01/22 15:38:55 INFO memory.MemoryStore: Block broadcast_3 stored as values in memory (estimated size 1424.0 B, free 366.3 MB)
18/01/22 15:38:55 INFO memory.MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 959.0 B, free 366.3 MB)
18/01/22 15:38:55 INFO storage.BlockManagerInfo: Added broadcast_3_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 959.0 B, free: 366.3 MB)
18/01/22 15:38:55 INFO spark.SparkContext: Created broadcast 3 from broadcast at DAGScheduler.scala:1029
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 3 (ParallelCollectionRDD[10] at parallelize at RRDD.scala:151) (first 15 tasks are for partitions Vector(0))
18/01/22 15:38:55 INFO scheduler.TaskSchedulerImpl: Adding task set 3.0 with 1 tasks
18/01/22 15:38:55 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 3.0 (TID 3, localhost, executor driver, partition 0, PROCESS_LOCAL, 8040 bytes)
18/01/22 15:38:55 INFO executor.Executor: Running task 0.0 in stage 3.0 (TID 3)
18/01/22 15:38:55 INFO executor.Executor: Finished task 0.0 in stage 3.0 (TID 3). 863 bytes result sent to driver
18/01/22 15:38:55 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 3.0 (TID 3) in 9 ms on localhost (executor driver) (1/1)
18/01/22 15:38:55 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 3.0, whose tasks have all completed, from pool
18/01/22 15:38:55 INFO scheduler.DAGScheduler: ResultStage 3 (collectPartitions at NativeMethodAccessorImpl.java:0) finished in 0.022 s
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Job 2 finished: collectPartitions at NativeMethodAccessorImpl.java:0, took 0.032219 s
18/01/22 15:38:55 INFO spark.SparkContext: Starting job: collectPartitions at NativeMethodAccessorImpl.java:0
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Got job 3 (collectPartitions at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Final stage: ResultStage 4 (collectPartitions at NativeMethodAccessorImpl.java:0)
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Submitting ResultStage 4 (ParallelCollectionRDD[14] at parallelize at RRDD.scala:151), which has no missing parents
18/01/22 15:38:55 INFO memory.MemoryStore: Block broadcast_4 stored as values in memory (estimated size 1424.0 B, free 366.3 MB)
18/01/22 15:38:55 INFO memory.MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 959.0 B, free 366.3 MB)
18/01/22 15:38:55 INFO storage.BlockManagerInfo: Added broadcast_4_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 959.0 B, free: 366.3 MB)
18/01/22 15:38:55 INFO spark.SparkContext: Created broadcast 4 from broadcast at DAGScheduler.scala:1029
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 4 (ParallelCollectionRDD[14] at parallelize at RRDD.scala:151) (first 15 tasks are for partitions Vector(0))
18/01/22 15:38:55 INFO scheduler.TaskSchedulerImpl: Adding task set 4.0 with 1 tasks
18/01/22 15:38:55 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 4.0 (TID 4, localhost, executor driver, partition 0, PROCESS_LOCAL, 7982 bytes)
18/01/22 15:38:55 INFO executor.Executor: Running task 0.0 in stage 4.0 (TID 4)
18/01/22 15:38:55 INFO executor.Executor: Finished task 0.0 in stage 4.0 (TID 4). 762 bytes result sent to driver
18/01/22 15:38:55 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 4.0 (TID 4) in 9 ms on localhost (executor driver) (1/1)
18/01/22 15:38:55 INFO scheduler.DAGScheduler: ResultStage 4 (collectPartitions at NativeMethodAccessorImpl.java:0) finished in 0.021 s
18/01/22 15:38:55 INFO scheduler.DAGScheduler: Job 3 finished: collectPartitions at NativeMethodAccessorImpl.java:0, took 0.027913 s
18/01/22 15:38:55 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 4.0, whose tasks have all completed, from pool
18/01/22 15:38:56 INFO spark.SparkContext: Starting job: count at NativeMethodAccessorImpl.java:0
18/01/22 15:38:56 INFO scheduler.DAGScheduler: Registering RDD 20 (count at NativeMethodAccessorImpl.java:0)
18/01/22 15:38:56 INFO scheduler.DAGScheduler: Got job 4 (count at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:38:56 INFO scheduler.DAGScheduler: Final stage: ResultStage 6 (count at NativeMethodAccessorImpl.java:0)
18/01/22 15:38:56 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 5)
18/01/22 15:38:56 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 5)
18/01/22 15:38:56 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 5 (MapPartitionsRDD[20] at count at NativeMethodAccessorImpl.java:0), which has no missing parents
18/01/22 15:38:56 INFO memory.MemoryStore: Block broadcast_5 stored as values in memory (estimated size 18.7 KB, free 366.2 MB)
18/01/22 15:38:56 INFO memory.MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 8.4 KB, free 366.2 MB)
18/01/22 15:38:56 INFO storage.BlockManagerInfo: Added broadcast_5_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 8.4 KB, free: 366.3 MB)
18/01/22 15:38:56 INFO spark.SparkContext: Created broadcast 5 from broadcast at DAGScheduler.scala:1029
18/01/22 15:38:56 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 5 (MapPartitionsRDD[20] at count at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
18/01/22 15:38:56 INFO scheduler.TaskSchedulerImpl: Adding task set 5.0 with 1 tasks
18/01/22 15:38:56 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 5.0 (TID 5, localhost, executor driver, partition 0, PROCESS_LOCAL, 7971 bytes)
18/01/22 15:38:56 INFO executor.Executor: Running task 0.0 in stage 5.0 (TID 5)
18/01/22 15:38:57 INFO codegen.CodeGenerator: Code generated in 18.713711 ms
18/01/22 15:38:57 INFO codegen.CodeGenerator: Code generated in 81.617185 ms
18/01/22 15:38:57 INFO r.RRunner: Times: boot = 0.217 s, init = 0.720 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.030 s, total = 0.987 s
18/01/22 15:38:57 INFO executor.Executor: Finished task 0.0 in stage 5.0 (TID 5). 1548 bytes result sent to driver
18/01/22 15:38:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 5.0 (TID 5) in 1143 ms on localhost (executor driver) (1/1)
18/01/22 15:38:57 INFO scheduler.DAGScheduler: ShuffleMapStage 5 (count at NativeMethodAccessorImpl.java:0) finished in 1.162 s
18/01/22 15:38:57 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:38:57 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:38:57 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 6)
18/01/22 15:38:57 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:38:57 INFO scheduler.DAGScheduler: Submitting ResultStage 6 (MapPartitionsRDD[23] at count at NativeMethodAccessorImpl.java:0), which has no missing parents
18/01/22 15:38:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool
18/01/22 15:38:57 INFO memory.MemoryStore: Block broadcast_6 stored as values in memory (estimated size 7.1 KB, free 366.2 MB)
18/01/22 15:38:57 INFO memory.MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 3.7 KB, free 366.2 MB)
18/01/22 15:38:57 INFO storage.BlockManagerInfo: Added broadcast_6_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 3.7 KB, free: 366.3 MB)
18/01/22 15:38:57 INFO spark.SparkContext: Created broadcast 6 from broadcast at DAGScheduler.scala:1029
18/01/22 15:38:57 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 6 (MapPartitionsRDD[23] at count at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
18/01/22 15:38:57 INFO scheduler.TaskSchedulerImpl: Adding task set 6.0 with 1 tasks
18/01/22 15:38:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 6.0 (TID 6, localhost, executor driver, partition 0, ANY, 7754 bytes)
18/01/22 15:38:57 INFO executor.Executor: Running task 0.0 in stage 6.0 (TID 6)
18/01/22 15:38:57 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
18/01/22 15:38:57 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
18/01/22 15:38:57 INFO executor.Executor: Finished task 0.0 in stage 6.0 (TID 6). 1739 bytes result sent to driver
18/01/22 15:38:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 6.0 (TID 6) in 19 ms on localhost (executor driver) (1/1)
18/01/22 15:38:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 6.0, whose tasks have all completed, from pool
18/01/22 15:38:57 INFO scheduler.DAGScheduler: ResultStage 6 (count at NativeMethodAccessorImpl.java:0) finished in 0.040 s
18/01/22 15:38:57 INFO scheduler.DAGScheduler: Job 4 finished: count at NativeMethodAccessorImpl.java:0, took 1.216667 s
18/01/22 15:38:57 INFO codegen.CodeGenerator: Code generated in 15.87193 ms
18/01/22 15:38:57 INFO spark.SparkContext: Starting job: dfToCols at NativeMethodAccessorImpl.java:0
18/01/22 15:38:57 INFO scheduler.DAGScheduler: Got job 5 (dfToCols at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:38:57 INFO scheduler.DAGScheduler: Final stage: ResultStage 7 (dfToCols at NativeMethodAccessorImpl.java:0)
18/01/22 15:38:57 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:38:57 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:38:57 INFO scheduler.DAGScheduler: Submitting ResultStage 7 (MapPartitionsRDD[25] at dfToCols at NativeMethodAccessorImpl.java:0), which has no missing parents
18/01/22 15:38:57 INFO memory.MemoryStore: Block broadcast_7 stored as values in memory (estimated size 14.3 KB, free 366.2 MB)
18/01/22 15:38:57 INFO memory.MemoryStore: Block broadcast_7_piece0 stored as bytes in memory (estimated size 6.4 KB, free 366.2 MB)
18/01/22 15:38:57 INFO storage.BlockManagerInfo: Added broadcast_7_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 6.4 KB, free: 366.3 MB)
18/01/22 15:38:57 INFO spark.SparkContext: Created broadcast 7 from broadcast at DAGScheduler.scala:1029
18/01/22 15:38:57 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 7 (MapPartitionsRDD[25] at dfToCols at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
18/01/22 15:38:57 INFO scheduler.TaskSchedulerImpl: Adding task set 7.0 with 1 tasks
18/01/22 15:38:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 7.0 (TID 7, localhost, executor driver, partition 0, PROCESS_LOCAL, 7982 bytes)
18/01/22 15:38:57 INFO executor.Executor: Running task 0.0 in stage 7.0 (TID 7)
18/01/22 15:38:58 INFO r.RRunner: Times: boot = 0.219 s, init = 0.710 s, broadcast = 0.010 s, read-input = 0.000 s, compute = 0.000 s, write-output = 0.030 s, total = 0.969 s
18/01/22 15:38:58 INFO executor.Executor: Finished task 0.0 in stage 7.0 (TID 7). 1157 bytes result sent to driver
18/01/22 15:38:58 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 7.0 (TID 7) in 986 ms on localhost (executor driver) (1/1)
18/01/22 15:38:58 INFO scheduler.DAGScheduler: ResultStage 7 (dfToCols at NativeMethodAccessorImpl.java:0) finished in 1.005 s
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Job 5 finished: dfToCols at NativeMethodAccessorImpl.java:0, took 1.013426 s
18/01/22 15:38:58 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 7.0, whose tasks have all completed, from pool
18/01/22 15:38:58 INFO spark.SparkContext: Starting job: collectPartitions at NativeMethodAccessorImpl.java:0
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Got job 6 (collectPartitions at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Final stage: ResultStage 8 (collectPartitions at NativeMethodAccessorImpl.java:0)
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Submitting ResultStage 8 (ParallelCollectionRDD[26] at parallelize at RRDD.scala:151), which has no missing parents
18/01/22 15:38:58 INFO memory.MemoryStore: Block broadcast_8 stored as values in memory (estimated size 1424.0 B, free 366.2 MB)
18/01/22 15:38:58 INFO memory.MemoryStore: Block broadcast_8_piece0 stored as bytes in memory (estimated size 959.0 B, free 366.2 MB)
18/01/22 15:38:58 INFO storage.BlockManagerInfo: Added broadcast_8_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 959.0 B, free: 366.3 MB)
18/01/22 15:38:58 INFO spark.SparkContext: Created broadcast 8 from broadcast at DAGScheduler.scala:1029
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 8 (ParallelCollectionRDD[26] at parallelize at RRDD.scala:151) (first 15 tasks are for partitions Vector(0))
18/01/22 15:38:58 INFO scheduler.TaskSchedulerImpl: Adding task set 8.0 with 1 tasks
18/01/22 15:38:58 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 8.0 (TID 8, localhost, executor driver, partition 0, PROCESS_LOCAL, 13759 bytes)
18/01/22 15:38:58 INFO executor.Executor: Running task 0.0 in stage 8.0 (TID 8)
18/01/22 15:38:58 INFO executor.Executor: Finished task 0.0 in stage 8.0 (TID 8). 6567 bytes result sent to driver
18/01/22 15:38:58 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 8.0 (TID 8) in 7 ms on localhost (executor driver) (1/1)
18/01/22 15:38:58 INFO scheduler.DAGScheduler: ResultStage 8 (collectPartitions at NativeMethodAccessorImpl.java:0) finished in 0.019 s
18/01/22 15:38:58 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 8.0, whose tasks have all completed, from pool
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Job 6 finished: collectPartitions at NativeMethodAccessorImpl.java:0, took 0.025454 s
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 183
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 130
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 42
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 61
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 101
18/01/22 15:38:58 INFO storage.BlockManagerInfo: Removed broadcast_4_piece0 on DESKTOP-BBK0H95:53046 in memory (size: 959.0 B, free: 366.3 MB)
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 149
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 248
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 175
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 216
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 62
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 179
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 198
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 140
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 137
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 240
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 128
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 56
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 92
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 147
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 94
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 64
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 85
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 119
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 169
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 88
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 139
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 213
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 246
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 146
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 106
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned shuffle 1
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 66
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 167
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 236
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 249
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 161
18/01/22 15:38:58 INFO storage.BlockManagerInfo: Removed broadcast_8_piece0 on DESKTOP-BBK0H95:53046 in memory (size: 959.0 B, free: 366.3 MB)
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 53
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 219
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 116
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 221
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 133
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 72
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 37
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 194
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 150
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 159
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 111
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 68
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 181
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 91
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 120
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 168
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 178
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 90
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 75
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 89
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 152
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 176
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 30
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 217
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 112
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 136
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 207
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 114
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 232
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 28
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 145
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 210
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 233
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 47
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 209
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 124
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 109
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 151
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 247
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 156
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 214
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 163
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 123
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 117
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 98
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 186
18/01/22 15:38:58 INFO storage.BlockManagerInfo: Removed broadcast_7_piece0 on DESKTOP-BBK0H95:53046 in memory (size: 6.4 KB, free: 366.3 MB)
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 226
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 243
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 71
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 220
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 59
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 34
18/01/22 15:38:58 INFO storage.BlockManagerInfo: Removed broadcast_6_piece0 on DESKTOP-BBK0H95:53046 in memory (size: 3.7 KB, free: 366.3 MB)
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 60
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 185
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 25
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 242
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 58
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 239
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 135
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 153
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 35
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 187
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 195
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 251
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 76
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 160
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 57
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 165
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 48
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 148
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 73
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 102
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 180
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 184
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 225
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 39
18/01/22 15:38:58 INFO storage.BlockManagerInfo: Removed broadcast_5_piece0 on DESKTOP-BBK0H95:53046 in memory (size: 8.4 KB, free: 366.3 MB)
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 44
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 86
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 126
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 49
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 164
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 141
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 162
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 82
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 171
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 234
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 255
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 196
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 155
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 83
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 229
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 177
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 223
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 69
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 27
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 29
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 40
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 174
18/01/22 15:38:58 INFO storage.BlockManagerInfo: Removed broadcast_3_piece0 on DESKTOP-BBK0H95:53046 in memory (size: 959.0 B, free: 366.3 MB)
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 143
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 104
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 202
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 245
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 127
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 79
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 230
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 33
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 105
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 144
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 70
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 100
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 191
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 190
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 43
18/01/22 15:38:58 INFO storage.BlockManagerInfo: Removed broadcast_1_piece0 on DESKTOP-BBK0H95:53046 in memory (size: 8.1 KB, free: 366.3 MB)
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 200
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 158
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 197
18/01/22 15:38:58 INFO storage.BlockManagerInfo: Removed broadcast_2_piece0 on DESKTOP-BBK0H95:53046 in memory (size: 3.7 KB, free: 366.3 MB)
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 253
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 134
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 125
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 77
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 110
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 206
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 237
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 203
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 121
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 81
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 204
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 67
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 50
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 254
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 74
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned shuffle 0
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 241
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 63
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 96
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 222
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 227
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 228
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 252
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 157
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 26
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 166
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 182
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 103
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 45
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 65
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 199
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 84
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 138
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 244
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 131
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 201
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 118
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 78
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 211
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 218
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 31
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 51
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 38
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 107
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 32
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 99
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 132
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 93
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 36
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 95
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 154
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 170
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 41
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 172
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 54
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 80
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 97
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 87
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 115
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 108
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 113
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 188
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 231
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 192
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 215
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 55
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 193
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 52
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 235
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 129
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 250
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 208
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 224
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 46
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 173
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 212
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 122
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 142
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 189
18/01/22 15:38:58 INFO spark.ContextCleaner: Cleaned accumulator 238
18/01/22 15:38:58 INFO codegen.CodeGenerator: Code generated in 22.27886 ms
18/01/22 15:38:58 INFO spark.SparkContext: Starting job: dfToCols at NativeMethodAccessorImpl.java:0
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Got job 7 (dfToCols at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Final stage: ResultStage 9 (dfToCols at NativeMethodAccessorImpl.java:0)
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Submitting ResultStage 9 (MapPartitionsRDD[31] at dfToCols at NativeMethodAccessorImpl.java:0), which has no missing parents
18/01/22 15:38:58 INFO memory.MemoryStore: Block broadcast_9 stored as values in memory (estimated size 17.2 KB, free 366.3 MB)
18/01/22 15:38:58 INFO memory.MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 6.9 KB, free 366.3 MB)
18/01/22 15:38:58 INFO storage.BlockManagerInfo: Added broadcast_9_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 6.9 KB, free: 366.3 MB)
18/01/22 15:38:58 INFO spark.SparkContext: Created broadcast 9 from broadcast at DAGScheduler.scala:1029
18/01/22 15:38:58 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 9 (MapPartitionsRDD[31] at dfToCols at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
18/01/22 15:38:58 INFO scheduler.TaskSchedulerImpl: Adding task set 9.0 with 1 tasks
18/01/22 15:38:58 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 9.0 (TID 9, localhost, executor driver, partition 0, PROCESS_LOCAL, 13759 bytes)
18/01/22 15:38:58 INFO executor.Executor: Running task 0.0 in stage 9.0 (TID 9)
18/01/22 15:38:59 INFO codegen.CodeGenerator: Code generated in 24.263849 ms
18/01/22 15:38:59 INFO codegen.CodeGenerator: Code generated in 84.194781 ms
18/01/22 15:38:59 INFO r.RRunner: Times: boot = 0.246 s, init = 0.720 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.150 s, total = 1.136 s
18/01/22 15:38:59 INFO executor.Executor: Finished task 0.0 in stage 9.0 (TID 9). 2422 bytes result sent to driver
18/01/22 15:38:59 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 9.0 (TID 9) in 1165 ms on localhost (executor driver) (1/1)
18/01/22 15:38:59 INFO scheduler.DAGScheduler: ResultStage 9 (dfToCols at NativeMethodAccessorImpl.java:0) finished in 1.179 s
18/01/22 15:38:59 INFO scheduler.DAGScheduler: Job 7 finished: dfToCols at NativeMethodAccessorImpl.java:0, took 1.185211 s
18/01/22 15:38:59 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 9.0, whose tasks have all completed, from pool
18/01/22 15:39:00 INFO spark.SparkContext: Starting job: collectPartitions at NativeMethodAccessorImpl.java:0
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Got job 8 (collectPartitions at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Final stage: ResultStage 10 (collectPartitions at NativeMethodAccessorImpl.java:0)
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Submitting ResultStage 10 (ParallelCollectionRDD[32] at parallelize at RRDD.scala:151), which has no missing parents
18/01/22 15:39:00 INFO memory.MemoryStore: Block broadcast_10 stored as values in memory (estimated size 1424.0 B, free 366.3 MB)
18/01/22 15:39:00 INFO memory.MemoryStore: Block broadcast_10_piece0 stored as bytes in memory (estimated size 959.0 B, free 366.3 MB)
18/01/22 15:39:00 INFO storage.BlockManagerInfo: Added broadcast_10_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 959.0 B, free: 366.3 MB)
18/01/22 15:39:00 INFO spark.SparkContext: Created broadcast 10 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 10 (ParallelCollectionRDD[32] at parallelize at RRDD.scala:151) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:00 INFO scheduler.TaskSchedulerImpl: Adding task set 10.0 with 1 tasks
18/01/22 15:39:00 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 10.0 (TID 10, localhost, executor driver, partition 0, PROCESS_LOCAL, 7890 bytes)
18/01/22 15:39:00 INFO executor.Executor: Running task 0.0 in stage 10.0 (TID 10)
18/01/22 15:39:00 INFO executor.Executor: Finished task 0.0 in stage 10.0 (TID 10). 670 bytes result sent to driver
18/01/22 15:39:00 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 10.0 (TID 10) in 10 ms on localhost (executor driver) (1/1)
18/01/22 15:39:00 INFO scheduler.DAGScheduler: ResultStage 10 (collectPartitions at NativeMethodAccessorImpl.java:0) finished in 0.022 s
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Job 8 finished: collectPartitions at NativeMethodAccessorImpl.java:0, took 0.029095 s
18/01/22 15:39:00 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 10.0, whose tasks have all completed, from pool
18/01/22 15:39:00 INFO codegen.CodeGenerator: Code generated in 11.515369 ms
18/01/22 15:39:00 INFO spark.SparkContext: Starting job: dfToCols at NativeMethodAccessorImpl.java:0
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Got job 9 (dfToCols at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Final stage: ResultStage 11 (dfToCols at NativeMethodAccessorImpl.java:0)
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Submitting ResultStage 11 (MapPartitionsRDD[37] at dfToCols at NativeMethodAccessorImpl.java:0), which has no missing parents
18/01/22 15:39:00 INFO memory.MemoryStore: Block broadcast_11 stored as values in memory (estimated size 13.6 KB, free 366.3 MB)
18/01/22 15:39:00 INFO memory.MemoryStore: Block broadcast_11_piece0 stored as bytes in memory (estimated size 6.1 KB, free 366.3 MB)
18/01/22 15:39:00 INFO storage.BlockManagerInfo: Added broadcast_11_piece0 in memory on DESKTOP-BBK0H95:53046 (size: 6.1 KB, free: 366.3 MB)
18/01/22 15:39:00 INFO spark.SparkContext: Created broadcast 11 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:00 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 11 (MapPartitionsRDD[37] at dfToCols at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:00 INFO scheduler.TaskSchedulerImpl: Adding task set 11.0 with 1 tasks
18/01/22 15:39:00 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 11.0 (TID 11, localhost, executor driver, partition 0, PROCESS_LOCAL, 7890 bytes)
18/01/22 15:39:00 INFO executor.Executor: Running task 0.0 in stage 11.0 (TID 11)
18/01/22 15:39:01 INFO codegen.CodeGenerator: Code generated in 12.500641 ms
18/01/22 15:39:01 INFO r.RRunner: Times: boot = 0.282 s, init = 0.750 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.000 s, total = 1.042 s
18/01/22 15:39:01 INFO codegen.CodeGenerator: Code generated in 21.011919 ms
18/01/22 15:39:01 INFO executor.Executor: Finished task 0.0 in stage 11.0 (TID 11). 1164 bytes result sent to driver
18/01/22 15:39:01 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 11.0 (TID 11) in 1097 ms on localhost (executor driver) (1/1)
18/01/22 15:39:01 INFO scheduler.DAGScheduler: ResultStage 11 (dfToCols at NativeMethodAccessorImpl.java:0) finished in 1.114 s
18/01/22 15:39:01 INFO scheduler.DAGScheduler: Job 9 finished: dfToCols at NativeMethodAccessorImpl.java:0, took 1.119243 s
18/01/22 15:39:01 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 11.0, whose tasks have all completed, from pool
18/01/22 15:39:01 INFO server.AbstractConnector: Stopped Spark@7c9936a0{HTTP/1.1,[http/1.1]}{0.0.0.0:4041}
18/01/22 15:39:01 INFO ui.SparkUI: Stopped Spark web UI at http://DESKTOP-BBK0H95:4041
18/01/22 15:39:01 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/01/22 15:39:01 INFO memory.MemoryStore: MemoryStore cleared
18/01/22 15:39:01 INFO storage.BlockManager: BlockManager stopped
18/01/22 15:39:01 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
18/01/22 15:39:01 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/01/22 15:39:01 INFO spark.SparkContext: Successfully stopped SparkContext
18/01/22 15:39:01 INFO spark.SparkContext: Running Spark version 2.3.0-SNAPSHOT
18/01/22 15:39:01 INFO spark.SparkContext: Submitted application: SparkR
18/01/22 15:39:01 INFO spark.SecurityManager: Changing view acls to: nam23
18/01/22 15:39:01 INFO spark.SecurityManager: Changing modify acls to: nam23
18/01/22 15:39:01 INFO spark.SecurityManager: Changing view acls groups to:
18/01/22 15:39:01 INFO spark.SecurityManager: Changing modify acls groups to:
18/01/22 15:39:01 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(nam23); groups with view permissions: Set(); users with modify permissions: Set(nam23); groups with modify permissions: Set()
18/01/22 15:39:01 INFO util.Utils: Successfully started service 'sparkDriver' on port 53081.
18/01/22 15:39:01 INFO spark.SparkEnv: Registering MapOutputTracker
18/01/22 15:39:01 INFO spark.SparkEnv: Registering BlockManagerMaster
18/01/22 15:39:01 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/01/22 15:39:01 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/01/22 15:39:01 INFO storage.DiskBlockManager: Created local directory at C:\Users\nam23\AppData\Local\Temp\blockmgr-7008b32c-04b8-446f-9494-3a5bb55f6d75
18/01/22 15:39:01 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
18/01/22 15:39:01 INFO spark.SparkEnv: Registering OutputCommitCoordinator
18/01/22 15:39:01 INFO server.Server: jetty-9.3.z-SNAPSHOT
18/01/22 15:39:01 INFO server.Server: Started @18008ms
18/01/22 15:39:01 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
18/01/22 15:39:01 INFO server.AbstractConnector: Started ServerConnector@560332bf{HTTP/1.1,[http/1.1]}{0.0.0.0:4041}
18/01/22 15:39:01 INFO util.Utils: Successfully started service 'SparkUI' on port 4041.
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@668030de{/jobs,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@dbc7135{/jobs/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51c9884{/jobs/job,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4ac5c452{/jobs/job/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2d240b14{/stages,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@43fff25c{/stages/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@22c69a1f{/stages/stage,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@46453d41{/stages/stage/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4bb4f236{/stages/pool,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7765840e{/stages/pool/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61e5ee9f{/storage,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@ea578d7{/storage/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@626c0e77{/storage/rdd,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@39f2b3c0{/storage/rdd/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@772d2377{/environment,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@52eb9e85{/environment/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@50f63d19{/executors,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7cb6e874{/executors/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@621e4a0e{/executors/threadDump,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6c0fff7e{/executors/threadDump/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@df00913{/static,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@46ceb591{/,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5690d97c{/api,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@48c219e3{/jobs/job/kill,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7b135f57{/stages/stage/kill,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://DESKTOP-BBK0H95:4041
18/01/22 15:39:01 INFO spark.SparkContext: Added file file:/C:/programs/spark-dev/R/pkg/tests/run-all.R at file:/C:/programs/spark-dev/R/pkg/tests/run-all.R with timestamp 1516653541659
18/01/22 15:39:01 INFO util.Utils: Copying C:\programs\spark-dev\R\pkg\tests\run-all.R to C:\Users\nam23\AppData\Local\Temp\spark-6e66f881-4d3e-406e-ac32-346ed40bd635\userFiles-b9efabb6-01e9-4ff8-9ab0-52b889d334b3\run-all.R
18/01/22 15:39:01 INFO executor.Executor: Starting executor ID driver on host localhost
18/01/22 15:39:01 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 53090.
18/01/22 15:39:01 INFO netty.NettyBlockTransferService: Server created on DESKTOP-BBK0H95:53090
18/01/22 15:39:01 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/01/22 15:39:01 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, DESKTOP-BBK0H95, 53090, None)
18/01/22 15:39:01 INFO storage.BlockManagerMasterEndpoint: Registering block manager DESKTOP-BBK0H95:53090 with 366.3 MB RAM, BlockManagerId(driver, DESKTOP-BBK0H95, 53090, None)
18/01/22 15:39:01 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, DESKTOP-BBK0H95, 53090, None)
18/01/22 15:39:01 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, DESKTOP-BBK0H95, 53090, None)
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2e496ece{/metrics/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO internal.SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/C:/programs/spark-dev/spark-warehouse/').
18/01/22 15:39:01 INFO internal.SharedState: Warehouse path is 'file:/C:/programs/spark-dev/spark-warehouse/'.
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4fa8f0b8{/SQL,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5516e5a2{/SQL/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@bb60894{/SQL/execution,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@23a944be{/SQL/execution/json,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@38e8b9ab{/static/sql,null,AVAILABLE,@Spark}
18/01/22 15:39:01 INFO state.StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
18/01/22 15:39:01 INFO spark.SparkContext: Starting job: collectPartitions at NativeMethodAccessorImpl.java:0
18/01/22 15:39:01 INFO scheduler.DAGScheduler: Got job 0 (collectPartitions at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:39:01 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (collectPartitions at NativeMethodAccessorImpl.java:0)
18/01/22 15:39:01 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:01 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:01 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (ParallelCollectionRDD[0] at parallelize at RRDD.scala:151), which has no missing parents
18/01/22 15:39:01 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1424.0 B, free 366.3 MB)
18/01/22 15:39:01 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 961.0 B, free 366.3 MB)
18/01/22 15:39:01 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 961.0 B, free: 366.3 MB)
18/01/22 15:39:01 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:01 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (ParallelCollectionRDD[0] at parallelize at RRDD.scala:151) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:01 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
18/01/22 15:39:01 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:01 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
18/01/22 15:39:01 INFO executor.Executor: Fetching file:/C:/programs/spark-dev/R/pkg/tests/run-all.R with timestamp 1516653541659
18/01/22 15:39:01 INFO util.Utils: C:\programs\spark-dev\R\pkg\tests\run-all.R has been previously copied to C:\Users\nam23\AppData\Local\Temp\spark-6e66f881-4d3e-406e-ac32-346ed40bd635\userFiles-b9efabb6-01e9-4ff8-9ab0-52b889d334b3\run-all.R
18/01/22 15:39:02 INFO executor.Executor: Finished task 0.0 in stage 0.0 (TID 0). 15214 bytes result sent to driver
18/01/22 15:39:02 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 167 ms on localhost (executor driver) (1/1)
18/01/22 15:39:02 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
18/01/22 15:39:02 INFO scheduler.DAGScheduler: ResultStage 0 (collectPartitions at NativeMethodAccessorImpl.java:0) finished in 0.179 s
18/01/22 15:39:02 INFO scheduler.DAGScheduler: Job 0 finished: collectPartitions at NativeMethodAccessorImpl.java:0, took 0.186815 s
18/01/22 15:39:02 INFO codegen.CodeGenerator: Code generated in 12.897107 ms
18/01/22 15:39:02 INFO spark.SparkContext: Starting job: countByValue at StringIndexer.scala:140
18/01/22 15:39:02 INFO scheduler.DAGScheduler: Registering RDD 10 (countByValue at StringIndexer.scala:140)
18/01/22 15:39:02 INFO scheduler.DAGScheduler: Got job 1 (countByValue at StringIndexer.scala:140) with 1 output partitions
18/01/22 15:39:02 INFO scheduler.DAGScheduler: Final stage: ResultStage 2 (countByValue at StringIndexer.scala:140)
18/01/22 15:39:02 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 1)
18/01/22 15:39:02 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 1)
18/01/22 15:39:02 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 1 (MapPartitionsRDD[10] at countByValue at StringIndexer.scala:140), which has no missing parents
18/01/22 15:39:02 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 22.2 KB, free 366.3 MB)
18/01/22 15:39:02 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 9.6 KB, free 366.3 MB)
18/01/22 15:39:02 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 9.6 KB, free: 366.3 MB)
18/01/22 15:39:02 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:02 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 1 (MapPartitionsRDD[10] at countByValue at StringIndexer.scala:140) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:02 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
18/01/22 15:39:02 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, PROCESS_LOCAL, 22310 bytes)
18/01/22 15:39:02 INFO executor.Executor: Running task 0.0 in stage 1.0 (TID 1)
18/01/22 15:39:03 INFO codegen.CodeGenerator: Code generated in 16.965156 ms
18/01/22 15:39:03 INFO codegen.CodeGenerator: Code generated in 11.576949 ms
18/01/22 15:39:04 INFO codegen.CodeGenerator: Code generated in 65.055336 ms
18/01/22 15:39:04 INFO r.RRunner: Times: boot = 0.364 s, init = 0.920 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.120 s, total = 1.424 s
18/01/22 15:39:04 INFO executor.Executor: Finished task 0.0 in stage 1.0 (TID 1). 1445 bytes result sent to driver
18/01/22 15:39:04 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 1514 ms on localhost (executor driver) (1/1)
18/01/22 15:39:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
18/01/22 15:39:04 INFO scheduler.DAGScheduler: ShuffleMapStage 1 (countByValue at StringIndexer.scala:140) finished in 1.545 s
18/01/22 15:39:04 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:39:04 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:39:04 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 2)
18/01/22 15:39:04 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:39:04 INFO scheduler.DAGScheduler: Submitting ResultStage 2 (ShuffledRDD[11] at countByValue at StringIndexer.scala:140), which has no missing parents
18/01/22 15:39:04 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 3.2 KB, free 366.3 MB)
18/01/22 15:39:04 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1971.0 B, free 366.3 MB)
18/01/22 15:39:04 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 1971.0 B, free: 366.3 MB)
18/01/22 15:39:04 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:04 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (ShuffledRDD[11] at countByValue at StringIndexer.scala:140) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:04 INFO scheduler.TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
18/01/22 15:39:04 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 2.0 (TID 2, localhost, executor driver, partition 0, ANY, 7649 bytes)
18/01/22 15:39:04 INFO executor.Executor: Running task 0.0 in stage 2.0 (TID 2)
18/01/22 15:39:04 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
18/01/22 15:39:04 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 3 ms
18/01/22 15:39:04 INFO executor.Executor: Finished task 0.0 in stage 2.0 (TID 2). 1290 bytes result sent to driver
18/01/22 15:39:04 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 2.0 (TID 2) in 31 ms on localhost (executor driver) (1/1)
18/01/22 15:39:04 INFO scheduler.DAGScheduler: ResultStage 2 (countByValue at StringIndexer.scala:140) finished in 0.042 s
18/01/22 15:39:04 INFO scheduler.DAGScheduler: Job 1 finished: countByValue at StringIndexer.scala:140, took 1.793389 s
18/01/22 15:39:04 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 363
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 341
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 345
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 378
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 382
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 366
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 348
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 352
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 333
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 389
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 365
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 349
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 361
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 335
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 355
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 354
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 392
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 340
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 381
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 385
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 388
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 396
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 343
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 379
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 384
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 397
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 390
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned shuffle 0
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 400
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 402
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 371
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 364
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 358
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 372
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 399
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 367
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 406
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 376
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 393
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 353
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 383
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 409
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 351
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 336
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 369
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 346
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 387
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 374
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 404
18/01/22 15:39:04 INFO storage.BlockManagerInfo: Removed broadcast_0_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 961.0 B, free: 366.3 MB)
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 359
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 334
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 368
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 401
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 342
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 350
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 403
18/01/22 15:39:04 INFO storage.BlockManagerInfo: Removed broadcast_1_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 9.6 KB, free: 366.3 MB)
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 375
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 405
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 373
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 410
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 337
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 356
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 357
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 408
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 398
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 380
18/01/22 15:39:04 INFO storage.BlockManagerInfo: Removed broadcast_2_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 1971.0 B, free: 366.2 MB)
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 338
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 386
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 377
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 347
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 360
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 391
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 407
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 344
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 339
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 394
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 370
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 362
18/01/22 15:39:04 INFO spark.ContextCleaner: Cleaned accumulator 395
18/01/22 15:39:05 INFO codegen.CodeGenerator: Code generated in 9.628831 ms
18/01/22 15:39:05 INFO codegen.CodeGenerator: Code generated in 64.480214 ms
18/01/22 15:39:05 INFO spark.SparkContext: Starting job: first at GeneralizedLinearRegression.scala:379
18/01/22 15:39:05 INFO scheduler.DAGScheduler: Got job 2 (first at GeneralizedLinearRegression.scala:379) with 1 output partitions
18/01/22 15:39:05 INFO scheduler.DAGScheduler: Final stage: ResultStage 3 (first at GeneralizedLinearRegression.scala:379)
18/01/22 15:39:05 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:05 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:05 INFO scheduler.DAGScheduler: Submitting ResultStage 3 (MapPartitionsRDD[15] at first at GeneralizedLinearRegression.scala:379), which has no missing parents
18/01/22 15:39:05 INFO memory.MemoryStore: Block broadcast_3 stored as values in memory (estimated size 40.2 KB, free 366.3 MB)
18/01/22 15:39:05 INFO memory.MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 14.9 KB, free 366.2 MB)
18/01/22 15:39:05 INFO storage.BlockManagerInfo: Added broadcast_3_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 14.9 KB, free: 366.3 MB)
18/01/22 15:39:05 INFO spark.SparkContext: Created broadcast 3 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:05 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 3 (MapPartitionsRDD[15] at first at GeneralizedLinearRegression.scala:379) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:05 INFO scheduler.TaskSchedulerImpl: Adding task set 3.0 with 1 tasks
18/01/22 15:39:05 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 3.0 (TID 3, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:05 INFO executor.Executor: Running task 0.0 in stage 3.0 (TID 3)
18/01/22 15:39:06 INFO executor.Executor: Finished task 0.0 in stage 3.0 (TID 3). 1139 bytes result sent to driver
18/01/22 15:39:06 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 3.0 (TID 3) in 1047 ms on localhost (executor driver) (1/1)
18/01/22 15:39:06 INFO scheduler.DAGScheduler: ResultStage 3 (first at GeneralizedLinearRegression.scala:379) finished in 1.074 s
18/01/22 15:39:06 INFO scheduler.DAGScheduler: Job 2 finished: first at GeneralizedLinearRegression.scala:379, took 1.094540 s
18/01/22 15:39:06 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 3.0, whose tasks have all completed, from pool
18/01/22 15:39:06 INFO codegen.CodeGenerator: Code generated in 37.776371 ms
18/01/22 15:39:06 INFO util.Instrumentation: GeneralizedLinearRegression-glm_a84ec7eb7373-744342938-1: training: numPartitions=1 storageLevel=StorageLevel(1 replicas)
18/01/22 15:39:06 INFO util.Instrumentation: GeneralizedLinearRegression-glm_a84ec7eb7373-744342938-1: {"featuresCol":"features","fitIntercept":true,"link":"identity","regParam":0.0,"tol":1.0E-6,"family":"gaussian","maxIter":25}
18/01/22 15:39:06 INFO util.Instrumentation: GeneralizedLinearRegression-glm_a84ec7eb7373-744342938-1: {"numFeatures":3}
18/01/22 15:39:06 INFO codegen.CodeGenerator: Code generated in 33.317938 ms
18/01/22 15:39:06 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:06 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:06 INFO scheduler.DAGScheduler: Got job 3 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:06 INFO scheduler.DAGScheduler: Final stage: ResultStage 4 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:06 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:06 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:06 INFO scheduler.DAGScheduler: Submitting ResultStage 4 (MapPartitionsRDD[25] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:06 INFO memory.MemoryStore: Block broadcast_4 stored as values in memory (estimated size 42.0 KB, free 366.2 MB)
18/01/22 15:39:06 INFO memory.MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 16.8 KB, free 366.2 MB)
18/01/22 15:39:06 INFO storage.BlockManagerInfo: Added broadcast_4_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 16.8 KB, free: 366.3 MB)
18/01/22 15:39:06 INFO spark.SparkContext: Created broadcast 4 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:06 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 4 (MapPartitionsRDD[25] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:06 INFO scheduler.TaskSchedulerImpl: Adding task set 4.0 with 1 tasks
18/01/22 15:39:06 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 4.0 (TID 4, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:06 INFO executor.Executor: Running task 0.0 in stage 4.0 (TID 4)
18/01/22 15:39:07 INFO codegen.CodeGenerator: Code generated in 11.406275 ms
18/01/22 15:39:07 WARN netlib.BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
18/01/22 15:39:07 WARN netlib.BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
18/01/22 15:39:07 INFO r.RRunner: Times: boot = 0.267 s, init = 0.750 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.140 s, total = 1.167 s
18/01/22 15:39:07 INFO executor.Executor: Finished task 0.0 in stage 4.0 (TID 4). 1509 bytes result sent to driver
18/01/22 15:39:07 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 4.0 (TID 4) in 1190 ms on localhost (executor driver) (1/1)
18/01/22 15:39:07 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 4.0, whose tasks have all completed, from pool
18/01/22 15:39:07 INFO scheduler.DAGScheduler: ResultStage 4 (treeAggregate at WeightedLeastSquares.scala:100) finished in 1.203 s
18/01/22 15:39:07 INFO scheduler.DAGScheduler: Job 3 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.209319 s
18/01/22 15:39:07 INFO optim.WeightedLeastSquares: Number of instances: 150.
18/01/22 15:39:07 WARN netlib.LAPACK: Failed to load implementation from: com.github.fommil.netlib.NativeSystemLAPACK
18/01/22 15:39:07 WARN netlib.LAPACK: Failed to load implementation from: com.github.fommil.netlib.NativeRefLAPACK
18/01/22 15:39:08 INFO util.Instrumentation: GeneralizedLinearRegression-glm_a84ec7eb7373-744342938-1: training finished
18/01/22 15:39:08 INFO codegen.CodeGenerator: Code generated in 7.1303 ms
18/01/22 15:39:08 INFO codegen.CodeGenerator: Code generated in 15.18049 ms
18/01/22 15:39:08 INFO codegen.CodeGenerator: Code generated in 32.75498 ms
18/01/22 15:39:08 INFO spark.SparkContext: Starting job: first at GeneralizedLinearRegression.scala:1366
18/01/22 15:39:08 INFO scheduler.DAGScheduler: Registering RDD 28 (first at GeneralizedLinearRegression.scala:1366)
18/01/22 15:39:08 INFO scheduler.DAGScheduler: Got job 4 (first at GeneralizedLinearRegression.scala:1366) with 1 output partitions
18/01/22 15:39:08 INFO scheduler.DAGScheduler: Final stage: ResultStage 6 (first at GeneralizedLinearRegression.scala:1366)
18/01/22 15:39:08 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 5)
18/01/22 15:39:08 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 5)
18/01/22 15:39:08 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 5 (MapPartitionsRDD[28] at first at GeneralizedLinearRegression.scala:1366), which has no missing parents
18/01/22 15:39:08 INFO memory.MemoryStore: Block broadcast_5 stored as values in memory (estimated size 67.2 KB, free 366.1 MB)
18/01/22 15:39:08 INFO memory.MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 26.5 KB, free 366.1 MB)
18/01/22 15:39:08 INFO storage.BlockManagerInfo: Added broadcast_5_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 26.5 KB, free: 366.2 MB)
18/01/22 15:39:08 INFO spark.SparkContext: Created broadcast 5 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:08 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 5 (MapPartitionsRDD[28] at first at GeneralizedLinearRegression.scala:1366) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:08 INFO scheduler.TaskSchedulerImpl: Adding task set 5.0 with 1 tasks
18/01/22 15:39:08 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 5.0 (TID 5, localhost, executor driver, partition 0, PROCESS_LOCAL, 22310 bytes)
18/01/22 15:39:08 INFO executor.Executor: Running task 0.0 in stage 5.0 (TID 5)
18/01/22 15:39:09 INFO r.RRunner: Times: boot = 0.295 s, init = 1.060 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.140 s, total = 1.515 s
18/01/22 15:39:09 INFO executor.Executor: Finished task 0.0 in stage 5.0 (TID 5). 1700 bytes result sent to driver
18/01/22 15:39:09 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 5.0 (TID 5) in 1543 ms on localhost (executor driver) (1/1)
18/01/22 15:39:09 INFO scheduler.DAGScheduler: ShuffleMapStage 5 (first at GeneralizedLinearRegression.scala:1366) finished in 1.561 s
18/01/22 15:39:09 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:39:09 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:39:09 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 6)
18/01/22 15:39:09 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:39:09 INFO scheduler.DAGScheduler: Submitting ResultStage 6 (MapPartitionsRDD[32] at first at GeneralizedLinearRegression.scala:1366), which has no missing parents
18/01/22 15:39:09 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool
18/01/22 15:39:09 INFO memory.MemoryStore: Block broadcast_6 stored as values in memory (estimated size 8.6 KB, free 366.1 MB)
18/01/22 15:39:09 INFO memory.MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 4.3 KB, free 366.1 MB)
18/01/22 15:39:09 INFO storage.BlockManagerInfo: Added broadcast_6_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 4.3 KB, free: 366.2 MB)
18/01/22 15:39:09 INFO spark.SparkContext: Created broadcast 6 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:09 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 6 (MapPartitionsRDD[32] at first at GeneralizedLinearRegression.scala:1366) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:09 INFO scheduler.TaskSchedulerImpl: Adding task set 6.0 with 1 tasks
18/01/22 15:39:09 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 6.0 (TID 6, localhost, executor driver, partition 0, ANY, 7754 bytes)
18/01/22 15:39:09 INFO executor.Executor: Running task 0.0 in stage 6.0 (TID 6)
18/01/22 15:39:09 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
18/01/22 15:39:09 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 2 ms
18/01/22 15:39:09 INFO executor.Executor: Finished task 0.0 in stage 6.0 (TID 6). 1557 bytes result sent to driver
18/01/22 15:39:09 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 6.0 (TID 6) in 16 ms on localhost (executor driver) (1/1)
18/01/22 15:39:09 INFO scheduler.DAGScheduler: ResultStage 6 (first at GeneralizedLinearRegression.scala:1366) finished in 0.034 s
18/01/22 15:39:09 INFO scheduler.DAGScheduler: Job 4 finished: first at GeneralizedLinearRegression.scala:1366, took 1.624648 s
18/01/22 15:39:09 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 6.0, whose tasks have all completed, from pool
18/01/22 15:39:10 INFO spark.SparkContext: Starting job: count at GeneralizedLinearRegression.scala:1207
18/01/22 15:39:10 INFO scheduler.DAGScheduler: Registering RDD 35 (count at GeneralizedLinearRegression.scala:1207)
18/01/22 15:39:10 INFO scheduler.DAGScheduler: Got job 5 (count at GeneralizedLinearRegression.scala:1207) with 1 output partitions
18/01/22 15:39:10 INFO scheduler.DAGScheduler: Final stage: ResultStage 8 (count at GeneralizedLinearRegression.scala:1207)
18/01/22 15:39:10 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 7)
18/01/22 15:39:10 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 7)
18/01/22 15:39:10 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 7 (MapPartitionsRDD[35] at count at GeneralizedLinearRegression.scala:1207), which has no missing parents
18/01/22 15:39:10 INFO memory.MemoryStore: Block broadcast_7 stored as values in memory (estimated size 20.0 KB, free 366.1 MB)
18/01/22 15:39:10 INFO memory.MemoryStore: Block broadcast_7_piece0 stored as bytes in memory (estimated size 8.7 KB, free 366.1 MB)
18/01/22 15:39:10 INFO storage.BlockManagerInfo: Added broadcast_7_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 8.7 KB, free: 366.2 MB)
18/01/22 15:39:10 INFO spark.SparkContext: Created broadcast 7 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:10 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 7 (MapPartitionsRDD[35] at count at GeneralizedLinearRegression.scala:1207) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:10 INFO scheduler.TaskSchedulerImpl: Adding task set 7.0 with 1 tasks
18/01/22 15:39:10 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 7.0 (TID 7, localhost, executor driver, partition 0, PROCESS_LOCAL, 22310 bytes)
18/01/22 15:39:10 INFO executor.Executor: Running task 0.0 in stage 7.0 (TID 7)
18/01/22 15:39:11 INFO r.RRunner: Times: boot = 0.279 s, init = 0.880 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.100 s, total = 1.269 s
18/01/22 15:39:11 INFO executor.Executor: Finished task 0.0 in stage 7.0 (TID 7). 1462 bytes result sent to driver
18/01/22 15:39:11 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 7.0 (TID 7) in 1299 ms on localhost (executor driver) (1/1)
18/01/22 15:39:11 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 7.0, whose tasks have all completed, from pool
18/01/22 15:39:11 INFO scheduler.DAGScheduler: ShuffleMapStage 7 (count at GeneralizedLinearRegression.scala:1207) finished in 1.313 s
18/01/22 15:39:11 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:39:11 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:39:11 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 8)
18/01/22 15:39:11 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:39:11 INFO scheduler.DAGScheduler: Submitting ResultStage 8 (MapPartitionsRDD[38] at count at GeneralizedLinearRegression.scala:1207), which has no missing parents
18/01/22 15:39:11 INFO memory.MemoryStore: Block broadcast_8 stored as values in memory (estimated size 7.1 KB, free 366.0 MB)
18/01/22 15:39:11 INFO memory.MemoryStore: Block broadcast_8_piece0 stored as bytes in memory (estimated size 3.7 KB, free 366.0 MB)
18/01/22 15:39:11 INFO storage.BlockManagerInfo: Added broadcast_8_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 3.7 KB, free: 366.2 MB)
18/01/22 15:39:11 INFO spark.SparkContext: Created broadcast 8 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:11 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 8 (MapPartitionsRDD[38] at count at GeneralizedLinearRegression.scala:1207) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:11 INFO scheduler.TaskSchedulerImpl: Adding task set 8.0 with 1 tasks
18/01/22 15:39:11 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 8.0 (TID 8, localhost, executor driver, partition 0, ANY, 7754 bytes)
18/01/22 15:39:11 INFO executor.Executor: Running task 0.0 in stage 8.0 (TID 8)
18/01/22 15:39:11 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
18/01/22 15:39:11 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 2 ms
18/01/22 15:39:11 INFO executor.Executor: Finished task 0.0 in stage 8.0 (TID 8). 1696 bytes result sent to driver
18/01/22 15:39:11 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 8.0 (TID 8) in 11 ms on localhost (executor driver) (1/1)
18/01/22 15:39:11 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 8.0, whose tasks have all completed, from pool
18/01/22 15:39:11 INFO scheduler.DAGScheduler: ResultStage 8 (count at GeneralizedLinearRegression.scala:1207) finished in 0.021 s
18/01/22 15:39:11 INFO scheduler.DAGScheduler: Job 5 finished: count at GeneralizedLinearRegression.scala:1207, took 1.351612 s
18/01/22 15:39:11 INFO codegen.CodeGenerator: Code generated in 9.043446 ms
18/01/22 15:39:11 INFO codegen.CodeGenerator: Code generated in 17.086795 ms
18/01/22 15:39:11 INFO codegen.CodeGenerator: Code generated in 15.258415 ms
18/01/22 15:39:11 INFO spark.SparkContext: Starting job: first at GeneralizedLinearRegression.scala:1321
18/01/22 15:39:11 INFO scheduler.DAGScheduler: Registering RDD 41 (first at GeneralizedLinearRegression.scala:1321)
18/01/22 15:39:11 INFO scheduler.DAGScheduler: Got job 6 (first at GeneralizedLinearRegression.scala:1321) with 1 output partitions
18/01/22 15:39:11 INFO scheduler.DAGScheduler: Final stage: ResultStage 10 (first at GeneralizedLinearRegression.scala:1321)
18/01/22 15:39:11 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 9)
18/01/22 15:39:11 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 9)
18/01/22 15:39:11 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 9 (MapPartitionsRDD[41] at first at GeneralizedLinearRegression.scala:1321), which has no missing parents
18/01/22 15:39:11 INFO memory.MemoryStore: Block broadcast_9 stored as values in memory (estimated size 22.4 KB, free 366.0 MB)
18/01/22 15:39:11 INFO memory.MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 9.5 KB, free 366.0 MB)
18/01/22 15:39:11 INFO storage.BlockManagerInfo: Added broadcast_9_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 9.5 KB, free: 366.2 MB)
18/01/22 15:39:11 INFO spark.SparkContext: Created broadcast 9 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:11 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 9 (MapPartitionsRDD[41] at first at GeneralizedLinearRegression.scala:1321) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:11 INFO scheduler.TaskSchedulerImpl: Adding task set 9.0 with 1 tasks
18/01/22 15:39:11 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 9.0 (TID 9, localhost, executor driver, partition 0, PROCESS_LOCAL, 22310 bytes)
18/01/22 15:39:11 INFO executor.Executor: Running task 0.0 in stage 9.0 (TID 9)
18/01/22 15:39:12 INFO r.RRunner: Times: boot = 0.243 s, init = 0.750 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.060 s, total = 1.073 s
18/01/22 15:39:12 INFO executor.Executor: Finished task 0.0 in stage 9.0 (TID 9). 1505 bytes result sent to driver
18/01/22 15:39:12 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 9.0 (TID 9) in 1102 ms on localhost (executor driver) (1/1)
18/01/22 15:39:12 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 9.0, whose tasks have all completed, from pool
18/01/22 15:39:12 INFO scheduler.DAGScheduler: ShuffleMapStage 9 (first at GeneralizedLinearRegression.scala:1321) finished in 1.114 s
18/01/22 15:39:12 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:39:12 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:39:12 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 10)
18/01/22 15:39:12 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:39:12 INFO scheduler.DAGScheduler: Submitting ResultStage 10 (MapPartitionsRDD[45] at first at GeneralizedLinearRegression.scala:1321), which has no missing parents
18/01/22 15:39:12 INFO memory.MemoryStore: Block broadcast_10 stored as values in memory (estimated size 9.9 KB, free 366.0 MB)
18/01/22 15:39:12 INFO memory.MemoryStore: Block broadcast_10_piece0 stored as
bytes in memory (estimated size 4.7 KB, free 366.0 MB) 18/01/22 15:39:12 INFO storage.BlockManagerInfo: Added broadcast_10_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 4.7 KB, free: 366.2 MB) 18/01/22 15:39:12 INFO spark.SparkContext: Created broadcast 10 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:12 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 10 (MapPartitionsRDD[45] at first at GeneralizedLinearRegression.scala:1321) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:12 INFO scheduler.TaskSchedulerImpl: Adding task set 10.0 with 1 tasks 18/01/22 15:39:12 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 10.0 (TID 10, localhost, executor driver, partition 0, ANY, 7754 bytes) 18/01/22 15:39:12 INFO executor.Executor: Running task 0.0 in stage 10.0 (TID 10) 18/01/22 15:39:12 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks 18/01/22 15:39:12 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms 18/01/22 15:39:12 INFO executor.Executor: Finished task 0.0 in stage 10.0 (TID 10). 
1523 bytes result sent to driver 18/01/22 15:39:12 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 10.0 (TID 10) in 10 ms on localhost (executor driver) (1/1) 18/01/22 15:39:12 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 10.0, whose tasks have all completed, from pool 18/01/22 15:39:12 INFO scheduler.DAGScheduler: ResultStage 10 (first at GeneralizedLinearRegression.scala:1321) finished in 0.020 s 18/01/22 15:39:12 INFO scheduler.DAGScheduler: Job 6 finished: first at GeneralizedLinearRegression.scala:1321, took 1.154790 s 18/01/22 15:39:12 INFO codegen.CodeGenerator: Code generated in 9.621989 ms 18/01/22 15:39:12 INFO spark.SparkContext: Starting job: sum at GeneralizedLinearRegression.scala:1340 18/01/22 15:39:12 INFO scheduler.DAGScheduler: Got job 7 (sum at GeneralizedLinearRegression.scala:1340) with 1 output partitions 18/01/22 15:39:12 INFO scheduler.DAGScheduler: Final stage: ResultStage 11 (sum at GeneralizedLinearRegression.scala:1340) 18/01/22 15:39:12 INFO scheduler.DAGScheduler: Parents of final stage: List() 18/01/22 15:39:12 INFO scheduler.DAGScheduler: Missing parents: List() 18/01/22 15:39:12 INFO scheduler.DAGScheduler: Submitting ResultStage 11 (MapPartitionsRDD[50] at map at GeneralizedLinearRegression.scala:1337), which has no missing parents 18/01/22 15:39:12 INFO memory.MemoryStore: Block broadcast_11 stored as values in memory (estimated size 55.4 KB, free 365.9 MB) 18/01/22 15:39:12 INFO memory.MemoryStore: Block broadcast_11_piece0 stored as bytes in memory (estimated size 22.5 KB, free 365.9 MB) 18/01/22 15:39:12 INFO storage.BlockManagerInfo: Added broadcast_11_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 22.5 KB, free: 366.2 MB) 18/01/22 15:39:12 INFO spark.SparkContext: Created broadcast 11 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:12 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 11 (MapPartitionsRDD[50] at map at GeneralizedLinearRegression.scala:1337) (first 15 tasks are 
for partitions Vector(0)) 18/01/22 15:39:12 INFO scheduler.TaskSchedulerImpl: Adding task set 11.0 with 1 tasks 18/01/22 15:39:12 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 11.0 (TID 11, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes) 18/01/22 15:39:12 INFO executor.Executor: Running task 0.0 in stage 11.0 (TID 11) 18/01/22 15:39:13 INFO codegen.CodeGenerator: Code generated in 7.043252 ms 18/01/22 15:39:13 INFO r.RRunner: Times: boot = 0.244 s, init = 0.740 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.100 s, total = 1.094 s 18/01/22 15:39:13 INFO executor.Executor: Finished task 0.0 in stage 11.0 (TID 11). 1256 bytes result sent to driver 18/01/22 15:39:13 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 11.0 (TID 11) in 1097 ms on localhost (executor driver) (1/1) 18/01/22 15:39:13 INFO scheduler.DAGScheduler: ResultStage 11 (sum at GeneralizedLinearRegression.scala:1340) finished in 1.109 s 18/01/22 15:39:13 INFO scheduler.DAGScheduler: Job 7 finished: sum at GeneralizedLinearRegression.scala:1340, took 1.118438 s 18/01/22 15:39:13 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 11.0, whose tasks have all completed, from pool 18/01/22 15:39:14 INFO codegen.CodeGenerator: Code generated in 26.213107 ms 18/01/22 15:39:14 INFO spark.SparkContext: Starting job: sum at GeneralizedLinearRegression.scala:1351 18/01/22 15:39:14 INFO scheduler.DAGScheduler: Got job 8 (sum at GeneralizedLinearRegression.scala:1351) with 1 output partitions 18/01/22 15:39:14 INFO scheduler.DAGScheduler: Final stage: ResultStage 12 (sum at GeneralizedLinearRegression.scala:1351) 18/01/22 15:39:14 INFO scheduler.DAGScheduler: Parents of final stage: List() 18/01/22 15:39:14 INFO scheduler.DAGScheduler: Missing parents: List() 18/01/22 15:39:14 INFO scheduler.DAGScheduler: Submitting ResultStage 12 (MapPartitionsRDD[55] at map at GeneralizedLinearRegression.scala:1348), which has no missing parents 18/01/22 
15:39:14 INFO memory.MemoryStore: Block broadcast_12 stored as values in memory (estimated size 64.8 KB, free 365.9 MB) 18/01/22 15:39:14 INFO memory.MemoryStore: Block broadcast_12_piece0 stored as bytes in memory (estimated size 25.3 KB, free 365.8 MB) 18/01/22 15:39:14 INFO storage.BlockManagerInfo: Added broadcast_12_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 25.3 KB, free: 366.2 MB) 18/01/22 15:39:14 INFO spark.SparkContext: Created broadcast 12 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:14 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 12 (MapPartitionsRDD[55] at map at GeneralizedLinearRegression.scala:1348) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:14 INFO scheduler.TaskSchedulerImpl: Adding task set 12.0 with 1 tasks 18/01/22 15:39:14 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 12.0 (TID 12, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes) 18/01/22 15:39:14 INFO executor.Executor: Running task 0.0 in stage 12.0 (TID 12) 18/01/22 15:39:15 INFO r.RRunner: Times: boot = 0.265 s, init = 0.800 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.080 s, total = 1.155 s 18/01/22 15:39:15 INFO executor.Executor: Finished task 0.0 in stage 12.0 (TID 12). 
1256 bytes result sent to driver
18/01/22 15:39:15 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 12.0 (TID 12) in 1166 ms on localhost (executor driver) (1/1)
18/01/22 15:39:15 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 12.0, whose tasks have all completed, from pool
18/01/22 15:39:15 INFO scheduler.DAGScheduler: ResultStage 12 (sum at GeneralizedLinearRegression.scala:1351) finished in 1.184 s
18/01/22 15:39:15 INFO scheduler.DAGScheduler: Job 8 finished: sum at GeneralizedLinearRegression.scala:1351, took 1.194462 s
18/01/22 15:39:15 INFO codegen.CodeGenerator: Code generated in 14.254137 ms
18/01/22 15:39:15 INFO spark.SparkContext: Starting job: first at GeneralizedLinearRegression.scala:1373
18/01/22 15:39:15 INFO scheduler.DAGScheduler: Registering RDD 58 (first at GeneralizedLinearRegression.scala:1373)
18/01/22 15:39:15 INFO scheduler.DAGScheduler: Got job 9 (first at GeneralizedLinearRegression.scala:1373) with 1 output partitions
18/01/22 15:39:15 INFO scheduler.DAGScheduler: Final stage: ResultStage 14 (first at GeneralizedLinearRegression.scala:1373)
18/01/22 15:39:15 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 13)
18/01/22 15:39:15 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 13)
18/01/22 15:39:15 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 13 (MapPartitionsRDD[58] at first at GeneralizedLinearRegression.scala:1373), which has no missing parents
18/01/22 15:39:15 INFO memory.MemoryStore: Block broadcast_13 stored as values in memory (estimated size 20.4 KB, free 365.8 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 716
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 648
18/01/22 15:39:15 INFO memory.MemoryStore: Block broadcast_13_piece0 stored as bytes in memory (estimated size 8.9 KB, free 365.8 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 546
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 589
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 670
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 710
18/01/22 15:39:15 INFO storage.BlockManagerInfo: Added broadcast_13_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 8.9 KB, free: 366.2 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 665
18/01/22 15:39:15 INFO spark.SparkContext: Created broadcast 13 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 469
18/01/22 15:39:15 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 13 (MapPartitionsRDD[58] at first at GeneralizedLinearRegression.scala:1373) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 596
18/01/22 15:39:15 INFO scheduler.TaskSchedulerImpl: Adding task set 13.0 with 1 tasks
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 558
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 411
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 584
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 474
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 513
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 520
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 477
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 517
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 601
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 577
18/01/22 15:39:15 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 13.0 (TID 13, localhost, executor driver, partition 0, PROCESS_LOCAL, 22310 bytes)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 712
18/01/22 15:39:15 INFO executor.Executor: Running task 0.0 in stage 13.0 (TID 13)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 437
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 702
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 656
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 509
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 530
18/01/22 15:39:15 INFO storage.BlockManagerInfo: Removed broadcast_9_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 9.5 KB, free: 366.2 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 658
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 457
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 486
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 583
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 652
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 444
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 434
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 423
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 632
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 688
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 684
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned shuffle 2
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 682
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 505
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 523
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 604
18/01/22 15:39:15 INFO storage.BlockManagerInfo: Removed broadcast_6_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 4.3 KB, free: 366.2 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 649
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 706
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 707
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 493
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 533
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 489
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 542
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 431
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 576
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 418
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 619
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 552
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 709
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 549
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 653
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 494
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 705
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned shuffle 3
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 659
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 488
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 452
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 635
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 506
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 412
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 644
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 610
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 446
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 715
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 591
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 575
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 611
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 667
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 495
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 666
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 499
18/01/22 15:39:15 INFO storage.BlockManagerInfo: Removed broadcast_7_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 8.7 KB, free: 366.2 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 700
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 436
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 500
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 447
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 497
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 680
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 674
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 425
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 413
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 538
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 605
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 454
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 496
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 456
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 592
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 615
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 440
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 532
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 568
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 695
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 609
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 566
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 713
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 550
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 630
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 487
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 677
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 603
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 579
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 637
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 699
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 511
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 691
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 646
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 460
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 634
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 553
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 607
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 459
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 588
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 627
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 422
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 639
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 631
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 704
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 545
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 638
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 521
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 663
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 514
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 616
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 690
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 594
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 563
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 485
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 433
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 620
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 622
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 510
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 614
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 711
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 612
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 582
18/01/22 15:39:15 INFO storage.BlockManagerInfo: Removed broadcast_11_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 22.5 KB, free: 366.2 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 435
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 585
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 560
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 462
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 466
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 501
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 564
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 623
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 654
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 420
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 476
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 572
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 461
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 524
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 561
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 571
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 503
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 640
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 442
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 629
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 491
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 602
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 647
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 597
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 694
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 527
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 471
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 650
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 551
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 600
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 512
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 539
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 613
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 464
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 448
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 519
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 617
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 671
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 428
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 641
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 645
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 625
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 429
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 714
18/01/22 15:39:15 INFO storage.BlockManagerInfo: Removed broadcast_3_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 14.9 KB, free: 366.2 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 547
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 569
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 701
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 642
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 668
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 534
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 540
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 661
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 417
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 703
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 432
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 669
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 643
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 415
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 618
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 504
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 570
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 482
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 678
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 473
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 685
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 419
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 498
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 624
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 472
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 679
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 662
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 451
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 458
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 492
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 516
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 636
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 697
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 480
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 481
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 554
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 574
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 693
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 529
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 426
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 587
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 657
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 598
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 518
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 528
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 633
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 559
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 686
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 573
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 525
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 424
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 484
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 676
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 427
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 651
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 515
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 681
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 698
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 593
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 470
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 562
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 689
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 581
18/01/22 15:39:15 INFO storage.BlockManagerInfo: Removed broadcast_8_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 3.7 KB, free: 366.2 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 565
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 672
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 450
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 544
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 536
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 608
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 578
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 416
18/01/22 15:39:15 INFO storage.BlockManagerInfo: Removed broadcast_12_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 25.3 KB, free: 366.2 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 696
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 660
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned shuffle 1
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 483
18/01/22 15:39:15 INFO storage.BlockManagerInfo: Removed broadcast_4_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 16.8 KB, free: 366.3 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 522
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 673
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 468
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 449
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 421
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 430
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 490
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 465
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 463
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 502
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 675
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 692
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 548
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 599
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 478
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 475
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 621
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 443
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 664
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 626
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 687
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 555
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 508
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 590
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 683
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 580
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 453
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 414
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 557
18/01/22 15:39:15 INFO storage.BlockManagerInfo: Removed broadcast_10_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 4.7 KB, free: 366.3 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 595
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 628
18/01/22 15:39:15 INFO storage.BlockManagerInfo: Removed broadcast_5_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 26.5 KB, free: 366.3 MB)
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 556
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 537
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 531
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 455
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 655
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 526
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 467
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 567
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 479
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 541
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 507
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 586
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 543
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 606
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 441
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 445
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 535
18/01/22 15:39:15 INFO spark.ContextCleaner: Cleaned accumulator 708
18/01/22 15:39:16 INFO r.RRunner: Times: boot = 0.358 s, init = 0.750 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.060 s, total = 1.188 s
18/01/22 15:39:16 INFO executor.Executor: Finished task 0.0 in stage 13.0 (TID 13).
1419 bytes result sent to driver 18/01/22 15:39:16 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 13.0 (TID 13) in 1234 ms on localhost (executor driver) (1/1) 18/01/22 15:39:16 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 13.0, whose tasks have all completed, from pool 18/01/22 15:39:16 INFO scheduler.DAGScheduler: ShuffleMapStage 13 (first at GeneralizedLinearRegression.scala:1373) finished in 1.270 s 18/01/22 15:39:16 INFO scheduler.DAGScheduler: looking for newly runnable stages 18/01/22 15:39:16 INFO scheduler.DAGScheduler: running: Set() 18/01/22 15:39:16 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 14) 18/01/22 15:39:16 INFO scheduler.DAGScheduler: failed: Set() 18/01/22 15:39:16 INFO scheduler.DAGScheduler: Submitting ResultStage 14 (MapPartitionsRDD[62] at first at GeneralizedLinearRegression.scala:1373), which has no missing parents 18/01/22 15:39:16 INFO memory.MemoryStore: Block broadcast_14 stored as values in memory (estimated size 8.6 KB, free 366.3 MB) 18/01/22 15:39:16 INFO memory.MemoryStore: Block broadcast_14_piece0 stored as bytes in memory (estimated size 4.3 KB, free 366.3 MB) 18/01/22 15:39:16 INFO storage.BlockManagerInfo: Added broadcast_14_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 4.3 KB, free: 366.3 MB) 18/01/22 15:39:16 INFO spark.SparkContext: Created broadcast 14 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:16 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 14 (MapPartitionsRDD[62] at first at GeneralizedLinearRegression.scala:1373) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:16 INFO scheduler.TaskSchedulerImpl: Adding task set 14.0 with 1 tasks 18/01/22 15:39:16 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 14.0 (TID 14, localhost, executor driver, partition 0, ANY, 7754 bytes) 18/01/22 15:39:16 INFO executor.Executor: Running task 0.0 in stage 14.0 (TID 14) 18/01/22 15:39:16 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty 
blocks out of 1 blocks 18/01/22 15:39:16 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms 18/01/22 15:39:16 INFO executor.Executor: Finished task 0.0 in stage 14.0 (TID 14). 1509 bytes result sent to driver 18/01/22 15:39:16 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 14.0 (TID 14) in 10 ms on localhost (executor driver) (1/1) 18/01/22 15:39:16 INFO scheduler.DAGScheduler: ResultStage 14 (first at GeneralizedLinearRegression.scala:1373) finished in 0.018 s 18/01/22 15:39:16 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 14.0, whose tasks have all completed, from pool 18/01/22 15:39:16 INFO scheduler.DAGScheduler: Job 9 finished: first at GeneralizedLinearRegression.scala:1373, took 1.307759 s 18/01/22 15:39:16 INFO spark.SparkContext: Starting job: sum at GeneralizedLinearRegression.scala:696 18/01/22 15:39:16 INFO scheduler.DAGScheduler: Got job 10 (sum at GeneralizedLinearRegression.scala:696) with 1 output partitions 18/01/22 15:39:16 INFO scheduler.DAGScheduler: Final stage: ResultStage 15 (sum at GeneralizedLinearRegression.scala:696) 18/01/22 15:39:16 INFO scheduler.DAGScheduler: Parents of final stage: List() 18/01/22 15:39:16 INFO scheduler.DAGScheduler: Missing parents: List() 18/01/22 15:39:16 INFO scheduler.DAGScheduler: Submitting ResultStage 15 (MapPartitionsRDD[68] at map at GeneralizedLinearRegression.scala:696), which has no missing parents 18/01/22 15:39:16 INFO memory.MemoryStore: Block broadcast_15 stored as values in memory (estimated size 41.2 KB, free 366.2 MB) 18/01/22 15:39:16 INFO memory.MemoryStore: Block broadcast_15_piece0 stored as bytes in memory (estimated size 17.6 KB, free 366.2 MB) 18/01/22 15:39:16 INFO storage.BlockManagerInfo: Added broadcast_15_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 17.6 KB, free: 366.3 MB) 18/01/22 15:39:16 INFO spark.SparkContext: Created broadcast 15 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:16 INFO scheduler.DAGScheduler: Submitting 1 
missing tasks from ResultStage 15 (MapPartitionsRDD[68] at map at GeneralizedLinearRegression.scala:696) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:16 INFO scheduler.TaskSchedulerImpl: Adding task set 15.0 with 1 tasks 18/01/22 15:39:16 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 15.0 (TID 15, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes) 18/01/22 15:39:16 INFO executor.Executor: Running task 0.0 in stage 15.0 (TID 15) 18/01/22 15:39:17 INFO r.RRunner: Times: boot = 0.256 s, init = 0.820 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.080 s, total = 1.176 s 18/01/22 15:39:17 INFO executor.Executor: Finished task 0.0 in stage 15.0 (TID 15). 1104 bytes result sent to driver 18/01/22 15:39:17 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 15.0 (TID 15) in 1185 ms on localhost (executor driver) (1/1) 18/01/22 15:39:17 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 15.0, whose tasks have all completed, from pool 18/01/22 15:39:17 INFO scheduler.DAGScheduler: ResultStage 15 (sum at GeneralizedLinearRegression.scala:696) finished in 1.199 s 18/01/22 15:39:17 INFO scheduler.DAGScheduler: Job 10 finished: sum at GeneralizedLinearRegression.scala:696, took 1.206371 s 18/01/22 15:39:18 INFO codegen.CodeGenerator: Code generated in 20.940456 ms 18/01/22 15:39:18 INFO spark.SparkContext: Starting job: dfToCols at NativeMethodAccessorImpl.java:0 18/01/22 15:39:18 INFO scheduler.DAGScheduler: Got job 11 (dfToCols at NativeMethodAccessorImpl.java:0) with 1 output partitions 18/01/22 15:39:18 INFO scheduler.DAGScheduler: Final stage: ResultStage 16 (dfToCols at NativeMethodAccessorImpl.java:0) 18/01/22 15:39:18 INFO scheduler.DAGScheduler: Parents of final stage: List() 18/01/22 15:39:18 INFO scheduler.DAGScheduler: Missing parents: List() 18/01/22 15:39:18 INFO scheduler.DAGScheduler: Submitting ResultStage 16 (MapPartitionsRDD[72] at dfToCols at NativeMethodAccessorImpl.java:0), 
which has no missing parents 18/01/22 15:39:18 INFO memory.MemoryStore: Block broadcast_16 stored as values in memory (estimated size 65.2 KB, free 366.1 MB) 18/01/22 15:39:18 INFO memory.MemoryStore: Block broadcast_16_piece0 stored as bytes in memory (estimated size 25.4 KB, free 366.1 MB) 18/01/22 15:39:18 INFO storage.BlockManagerInfo: Added broadcast_16_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 25.4 KB, free: 366.2 MB) 18/01/22 15:39:18 INFO spark.SparkContext: Created broadcast 16 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:18 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 16 (MapPartitionsRDD[72] at dfToCols at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:18 INFO scheduler.TaskSchedulerImpl: Adding task set 16.0 with 1 tasks 18/01/22 15:39:18 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 16.0 (TID 16, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes) 18/01/22 15:39:18 INFO executor.Executor: Running task 0.0 in stage 16.0 (TID 16) 18/01/22 15:39:19 INFO executor.Executor: Finished task 0.0 in stage 16.0 (TID 16). 
1219 bytes result sent to driver 18/01/22 15:39:19 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 16.0 (TID 16) in 1020 ms on localhost (executor driver) (1/1) 18/01/22 15:39:19 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 16.0, whose tasks have all completed, from pool 18/01/22 15:39:19 INFO scheduler.DAGScheduler: ResultStage 16 (dfToCols at NativeMethodAccessorImpl.java:0) finished in 1.037 s 18/01/22 15:39:19 INFO scheduler.DAGScheduler: Job 11 finished: dfToCols at NativeMethodAccessorImpl.java:0, took 1.043692 s 18/01/22 15:39:19 INFO codegen.CodeGenerator: Code generated in 21.487069 ms 18/01/22 15:39:19 INFO spark.SparkContext: Starting job: dfToCols at NativeMethodAccessorImpl.java:0 18/01/22 15:39:19 INFO scheduler.DAGScheduler: Got job 12 (dfToCols at NativeMethodAccessorImpl.java:0) with 1 output partitions 18/01/22 15:39:19 INFO scheduler.DAGScheduler: Final stage: ResultStage 17 (dfToCols at NativeMethodAccessorImpl.java:0) 18/01/22 15:39:19 INFO scheduler.DAGScheduler: Parents of final stage: List() 18/01/22 15:39:19 INFO scheduler.DAGScheduler: Missing parents: List() 18/01/22 15:39:19 INFO scheduler.DAGScheduler: Submitting ResultStage 17 (MapPartitionsRDD[75] at dfToCols at NativeMethodAccessorImpl.java:0), which has no missing parents 18/01/22 15:39:19 INFO memory.MemoryStore: Block broadcast_17 stored as values in memory (estimated size 64.6 KB, free 366.0 MB) 18/01/22 15:39:19 INFO memory.MemoryStore: Block broadcast_17_piece0 stored as bytes in memory (estimated size 25.3 KB, free 366.0 MB) 18/01/22 15:39:19 INFO storage.BlockManagerInfo: Added broadcast_17_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 25.3 KB, free: 366.2 MB) 18/01/22 15:39:19 INFO spark.SparkContext: Created broadcast 17 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:19 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 17 (MapPartitionsRDD[75] at dfToCols at NativeMethodAccessorImpl.java:0) (first 15 tasks are for 
partitions Vector(0)) 18/01/22 15:39:19 INFO scheduler.TaskSchedulerImpl: Adding task set 17.0 with 1 tasks 18/01/22 15:39:19 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 17.0 (TID 17, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes) 18/01/22 15:39:19 INFO executor.Executor: Running task 0.0 in stage 17.0 (TID 17) 18/01/22 15:39:20 INFO r.RRunner: Times: boot = 0.372 s, init = 0.880 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.080 s, total = 1.342 s 18/01/22 15:39:20 INFO executor.Executor: Finished task 0.0 in stage 17.0 (TID 17). 2261 bytes result sent to driver 18/01/22 15:39:20 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 17.0 (TID 17) in 1347 ms on localhost (executor driver) (1/1) 18/01/22 15:39:20 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 17.0, whose tasks have all completed, from pool 18/01/22 15:39:20 INFO scheduler.DAGScheduler: ResultStage 17 (dfToCols at NativeMethodAccessorImpl.java:0) finished in 1.362 s 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Job 12 finished: dfToCols at NativeMethodAccessorImpl.java:0, took 1.373131 s 18/01/22 15:39:20 INFO spark.SparkContext: Starting job: collectPartitions at NativeMethodAccessorImpl.java:0 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Got job 13 (collectPartitions at NativeMethodAccessorImpl.java:0) with 1 output partitions 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Final stage: ResultStage 18 (collectPartitions at NativeMethodAccessorImpl.java:0) 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Parents of final stage: List() 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Missing parents: List() 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Submitting ResultStage 18 (ParallelCollectionRDD[76] at parallelize at RRDD.scala:151), which has no missing parents 18/01/22 15:39:20 INFO memory.MemoryStore: Block broadcast_18 stored as values in memory (estimated size 1424.0 B, free 366.0 MB) 18/01/22 15:39:20 INFO 
memory.MemoryStore: Block broadcast_18_piece0 stored as bytes in memory (estimated size 959.0 B, free 366.0 MB) 18/01/22 15:39:20 INFO storage.BlockManagerInfo: Added broadcast_18_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 959.0 B, free: 366.2 MB) 18/01/22 15:39:20 INFO spark.SparkContext: Created broadcast 18 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 18 (ParallelCollectionRDD[76] at parallelize at RRDD.scala:151) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:20 INFO scheduler.TaskSchedulerImpl: Adding task set 18.0 with 1 tasks 18/01/22 15:39:20 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 18.0 (TID 18, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes) 18/01/22 15:39:20 INFO executor.Executor: Running task 0.0 in stage 18.0 (TID 18) 18/01/22 15:39:20 INFO executor.Executor: Finished task 0.0 in stage 18.0 (TID 18). 4669 bytes result sent to driver 18/01/22 15:39:20 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 18.0 (TID 18) in 3 ms on localhost (executor driver) (1/1) 18/01/22 15:39:20 INFO scheduler.DAGScheduler: ResultStage 18 (collectPartitions at NativeMethodAccessorImpl.java:0) finished in 0.013 s 18/01/22 15:39:20 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 18.0, whose tasks have all completed, from pool 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Job 13 finished: collectPartitions at NativeMethodAccessorImpl.java:0, took 0.019675 s 18/01/22 15:39:20 INFO codegen.CodeGenerator: Code generated in 27.54619 ms 18/01/22 15:39:20 INFO spark.SparkContext: Starting job: first at GeneralizedLinearRegression.scala:379 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Got job 14 (first at GeneralizedLinearRegression.scala:379) with 1 output partitions 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Final stage: ResultStage 19 (first at GeneralizedLinearRegression.scala:379) 18/01/22 15:39:20 INFO 
scheduler.DAGScheduler: Parents of final stage: List() 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Missing parents: List() 18/01/22 15:39:20 INFO scheduler.DAGScheduler: Submitting ResultStage 19 (MapPartitionsRDD[83] at first at GeneralizedLinearRegression.scala:379), which has no missing parents 18/01/22 15:39:20 INFO memory.MemoryStore: Block broadcast_19 stored as values in memory (estimated size 31.6 KB, free 366.0 MB) 18/01/22 15:39:21 INFO memory.MemoryStore: Block broadcast_19_piece0 stored as bytes in memory (estimated size 11.1 KB, free 366.0 MB) 18/01/22 15:39:21 INFO storage.BlockManagerInfo: Added broadcast_19_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 11.1 KB, free: 366.2 MB) 18/01/22 15:39:21 INFO spark.SparkContext: Created broadcast 19 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:21 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 19 (MapPartitionsRDD[83] at first at GeneralizedLinearRegression.scala:379) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:21 INFO scheduler.TaskSchedulerImpl: Adding task set 19.0 with 1 tasks 18/01/22 15:39:21 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 19.0 (TID 19, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes) 18/01/22 15:39:21 INFO executor.Executor: Running task 0.0 in stage 19.0 (TID 19) 18/01/22 15:39:22 INFO codegen.CodeGenerator: Code generated in 8.429172 ms 18/01/22 15:39:22 INFO codegen.CodeGenerator: Code generated in 17.38709 ms 18/01/22 15:39:22 INFO executor.Executor: Finished task 0.0 in stage 19.0 (TID 19). 
1092 bytes result sent to driver 18/01/22 15:39:22 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 19.0 (TID 19) in 1176 ms on localhost (executor driver) (1/1) 18/01/22 15:39:22 INFO scheduler.DAGScheduler: ResultStage 19 (first at GeneralizedLinearRegression.scala:379) finished in 1.191 s 18/01/22 15:39:22 INFO scheduler.DAGScheduler: Job 14 finished: first at GeneralizedLinearRegression.scala:379, took 1.199037 s 18/01/22 15:39:22 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 19.0, whose tasks have all completed, from pool 18/01/22 15:39:22 INFO codegen.CodeGenerator: Code generated in 27.112092 ms 18/01/22 15:39:22 INFO util.Instrumentation: GeneralizedLinearRegression-glm_0aef2c9e4757-551302425-2: training: numPartitions=1 storageLevel=StorageLevel(1 replicas) 18/01/22 15:39:22 INFO util.Instrumentation: GeneralizedLinearRegression-glm_0aef2c9e4757-551302425-2: {"featuresCol":"features","fitIntercept":true,"link":"inverse","regParam":0.0,"tol":1.0E-6,"family":"gamma","maxIter":25} 18/01/22 15:39:22 INFO util.Instrumentation: GeneralizedLinearRegression-glm_0aef2c9e4757-551302425-2: {"numFeatures":1} 18/01/22 15:39:22 INFO codegen.CodeGenerator: Code generated in 23.393754 ms 18/01/22 15:39:22 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting. 
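The Instrumentation entries above record the hyperparameters of this fit: a gamma-family GLM with an inverse link, regParam 0.0, tol 1.0E-6, and maxIter 25. The `Iteration N : relative tolerance = ...` entries logged later in this run come from the IRLS loop, which keeps iterating while the relative change between successive coefficient vectors exceeds tol. The sketch below shows one plausible form of that stopping rule; it is an illustrative assumption, not Spark's `IterativelyReweightedLeastSquares` implementation, whose exact scaling may differ.

```python
def relative_tolerance(old, new):
    """Largest per-coefficient change, scaled by the largest old magnitude.

    One plausible definition of the "relative tolerance" printed per
    iteration in the log; Spark's actual scaling may differ.
    """
    max_delta = max(abs(n - o) for o, n in zip(old, new))
    scale = max(max(abs(o) for o in old), 1e-15)  # guard against all-zero old
    return max_delta / scale


def converged(old, new, tol=1.0e-6):
    # tol matches the "tol":1.0E-6 shown in the Instrumentation JSON above
    return relative_tolerance(old, new) < tol
```

Under this rule, none of the relative tolerances logged in this run (roughly 0.017 to 0.15) are anywhere near 1.0E-6, which is consistent with the loop continuing iteration after iteration.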
18/01/22 15:39:22 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:22 INFO scheduler.DAGScheduler: Got job 15 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:22 INFO scheduler.DAGScheduler: Final stage: ResultStage 20 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:22 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:22 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:22 INFO scheduler.DAGScheduler: Submitting ResultStage 20 (MapPartitionsRDD[94] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:22 INFO memory.MemoryStore: Block broadcast_20 stored as values in memory (estimated size 34.1 KB, free 365.9 MB)
18/01/22 15:39:22 INFO memory.MemoryStore: Block broadcast_20_piece0 stored as bytes in memory (estimated size 13.2 KB, free 365.9 MB)
18/01/22 15:39:22 INFO storage.BlockManagerInfo: Added broadcast_20_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 13.2 KB, free: 366.2 MB)
18/01/22 15:39:22 INFO spark.SparkContext: Created broadcast 20 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:22 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 20 (MapPartitionsRDD[94] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:22 INFO scheduler.TaskSchedulerImpl: Adding task set 20.0 with 1 tasks
18/01/22 15:39:22 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 20.0 (TID 20, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:22 INFO executor.Executor: Running task 0.0 in stage 20.0 (TID 20)
18/01/22 15:39:23 INFO r.RRunner: Times: boot = 0.236 s, init = 0.800 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.080 s, total = 1.126 s
18/01/22 15:39:23 INFO executor.Executor: Finished task 0.0 in stage 20.0 (TID 20). 1351 bytes result sent to driver
18/01/22 15:39:23 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 20.0 (TID 20) in 1133 ms on localhost (executor driver) (1/1)
18/01/22 15:39:23 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 20.0, whose tasks have all completed, from pool
18/01/22 15:39:23 INFO scheduler.DAGScheduler: ResultStage 20 (treeAggregate at WeightedLeastSquares.scala:100) finished in 1.144 s
18/01/22 15:39:23 INFO scheduler.DAGScheduler: Job 15 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.151752 s
18/01/22 15:39:23 INFO optim.WeightedLeastSquares: Number of instances: 100.
18/01/22 15:39:23 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:23 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:23 INFO scheduler.DAGScheduler: Got job 16 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:23 INFO scheduler.DAGScheduler: Final stage: ResultStage 21 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:23 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:23 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:23 INFO scheduler.DAGScheduler: Submitting ResultStage 21 (MapPartitionsRDD[96] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:23 INFO memory.MemoryStore: Block broadcast_21 stored as values in memory (estimated size 34.7 KB, free 365.9 MB)
18/01/22 15:39:23 INFO memory.MemoryStore: Block broadcast_21_piece0 stored as bytes in memory (estimated size 13.6 KB, free 365.9 MB)
18/01/22 15:39:23 INFO storage.BlockManagerInfo: Added broadcast_21_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 13.6 KB, free: 366.2 MB)
18/01/22 15:39:23 INFO spark.SparkContext: Created broadcast 21 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:23 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 21 (MapPartitionsRDD[96] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:23 INFO scheduler.TaskSchedulerImpl: Adding task set 21.0 with 1 tasks
18/01/22 15:39:23 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 21.0 (TID 21, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:23 INFO executor.Executor: Running task 0.0 in stage 21.0 (TID 21)
18/01/22 15:39:24 INFO r.RRunner: Times: boot = 0.220 s, init = 0.720 s, broadcast = 0.020 s, read-input = 0.000 s, compute = 0.000 s, write-output = 0.050 s, total = 1.010 s
18/01/22 15:39:24 INFO executor.Executor: Finished task 0.0 in stage 21.0 (TID 21). 1394 bytes result sent to driver
18/01/22 15:39:24 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 21.0 (TID 21) in 1013 ms on localhost (executor driver) (1/1)
18/01/22 15:39:24 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 21.0, whose tasks have all completed, from pool
18/01/22 15:39:24 INFO scheduler.DAGScheduler: ResultStage 21 (treeAggregate at WeightedLeastSquares.scala:100) finished in 1.021 s
18/01/22 15:39:24 INFO scheduler.DAGScheduler: Job 16 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.032732 s
18/01/22 15:39:24 INFO optim.WeightedLeastSquares: Number of instances: 100.
18/01/22 15:39:24 INFO optim.IterativelyReweightedLeastSquares: Iteration 0 : relative tolerance = 0.1507289920676096
18/01/22 15:39:24 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:24 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:24 INFO scheduler.DAGScheduler: Got job 17 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:24 INFO scheduler.DAGScheduler: Final stage: ResultStage 22 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:24 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:24 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:24 INFO scheduler.DAGScheduler: Submitting ResultStage 22 (MapPartitionsRDD[98] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:24 INFO memory.MemoryStore: Block broadcast_22 stored as values in memory (estimated size 34.8 KB, free 365.9 MB)
18/01/22 15:39:24 INFO memory.MemoryStore: Block broadcast_22_piece0 stored as bytes in memory (estimated size 13.7 KB, free 365.8 MB)
18/01/22 15:39:24 INFO storage.BlockManagerInfo: Added broadcast_22_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 13.7 KB, free: 366.2 MB)
18/01/22 15:39:24 INFO spark.SparkContext: Created broadcast 22 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:24 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 22 (MapPartitionsRDD[98] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:24 INFO scheduler.TaskSchedulerImpl: Adding task set 22.0 with 1 tasks
18/01/22 15:39:24 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 22.0 (TID 22, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:24 INFO executor.Executor: Running task 0.0 in stage 22.0 (TID 22)
18/01/22 15:39:25 INFO r.RRunner: Times: boot = 0.204 s, init = 0.730 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.050 s, total = 1.004 s
18/01/22 15:39:25 INFO executor.Executor: Finished task 0.0 in stage 22.0 (TID 22). 1394 bytes result sent to driver
18/01/22 15:39:25 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 22.0 (TID 22) in 1003 ms on localhost (executor driver) (1/1)
18/01/22 15:39:25 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 22.0, whose tasks have all completed, from pool
18/01/22 15:39:25 INFO scheduler.DAGScheduler: ResultStage 22 (treeAggregate at WeightedLeastSquares.scala:100) finished in 1.010 s
18/01/22 15:39:25 INFO scheduler.DAGScheduler: Job 17 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.022671 s
18/01/22 15:39:25 INFO optim.WeightedLeastSquares: Number of instances: 100.
18/01/22 15:39:25 INFO optim.IterativelyReweightedLeastSquares: Iteration 1 : relative tolerance = 0.01717436577067666
18/01/22 15:39:25 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:25 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:25 INFO scheduler.DAGScheduler: Got job 18 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:25 INFO scheduler.DAGScheduler: Final stage: ResultStage 23 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:25 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:25 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:25 INFO scheduler.DAGScheduler: Submitting ResultStage 23 (MapPartitionsRDD[100] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:25 INFO memory.MemoryStore: Block broadcast_23 stored as values in memory (estimated size 34.8 KB, free 365.8 MB)
18/01/22 15:39:25 INFO memory.MemoryStore: Block broadcast_23_piece0 stored as bytes in memory (estimated size 13.7 KB, free 365.8 MB)
18/01/22 15:39:25 INFO storage.BlockManagerInfo: Added broadcast_23_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 13.7 KB, free: 366.2 MB)
18/01/22 15:39:25 INFO spark.SparkContext: Created broadcast 23 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:25 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 23 (MapPartitionsRDD[100] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:25 INFO scheduler.TaskSchedulerImpl: Adding task set 23.0 with 1 tasks
18/01/22 15:39:25 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 23.0 (TID 23, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:25 INFO executor.Executor: Running task 0.0 in stage 23.0 (TID 23)
18/01/22 15:39:26 INFO r.RRunner: Times: boot = 0.202 s, init = 0.720 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.050 s, total = 0.992 s
18/01/22 15:39:26 INFO executor.Executor: Finished task 0.0 in stage 23.0 (TID 23). 1394 bytes result sent to driver
18/01/22 15:39:26 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 23.0 (TID 23) in 985 ms on localhost (executor driver) (1/1)
18/01/22 15:39:26 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 23.0, whose tasks have all completed, from pool
18/01/22 15:39:26 INFO scheduler.DAGScheduler: ResultStage 23 (treeAggregate at WeightedLeastSquares.scala:100) finished in 0.992 s
18/01/22 15:39:26 INFO scheduler.DAGScheduler: Job 18 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.004686 s
18/01/22 15:39:26 INFO optim.WeightedLeastSquares: Number of instances: 100.
18/01/22 15:39:26 INFO optim.IterativelyReweightedLeastSquares: Iteration 2 : relative tolerance = 0.04632310411561036
18/01/22 15:39:26 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
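Every entry in this capture follows the same log4j-style layout, `yy/MM/dd HH:mm:ss LEVEL logger: message`. When post-processing a run like this one (for example, to pull out just the WARN entries or the IRLS iterations), a small parser is handy; the regex and field names below are our own convention, not anything Spark ships.

```python
import re

# One log4j-style entry: "yy/MM/dd HH:mm:ss LEVEL logger: message".
# Field names (date/time/level/logger/message) are ours, chosen for clarity.
LOG_LINE = re.compile(
    r"^(?P<date>\d{2}/\d{2}/\d{2}) (?P<time>\d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) (?P<logger>[\w.]+): (?P<message>.*)$"
)


def parse(line):
    """Return the entry's fields as a dict, or None for non-matching lines."""
    m = LOG_LINE.match(line)
    return m.groupdict() if m else None
```

For instance, `parse("18/01/22 15:39:26 WARN optim.WeightedLeastSquares: regParam is zero, ...")` yields `level` `"WARN"` and `logger` `"optim.WeightedLeastSquares"`, which makes filtering the repeated regParam warnings out of (or into) a report a one-liner.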
18/01/22 15:39:26 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:26 INFO scheduler.DAGScheduler: Got job 19 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:26 INFO scheduler.DAGScheduler: Final stage: ResultStage 24 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:26 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:26 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:26 INFO scheduler.DAGScheduler: Submitting ResultStage 24 (MapPartitionsRDD[102] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:26 INFO memory.MemoryStore: Block broadcast_24 stored as values in memory (estimated size 34.8 KB, free 365.8 MB)
18/01/22 15:39:26 INFO memory.MemoryStore: Block broadcast_24_piece0 stored as bytes in memory (estimated size 13.7 KB, free 365.7 MB)
18/01/22 15:39:26 INFO storage.BlockManagerInfo: Added broadcast_24_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 13.7 KB, free: 366.1 MB)
18/01/22 15:39:26 INFO spark.SparkContext: Created broadcast 24 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:26 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 24 (MapPartitionsRDD[102] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:26 INFO scheduler.TaskSchedulerImpl: Adding task set 24.0 with 1 tasks
18/01/22 15:39:26 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 24.0 (TID 24, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:26 INFO executor.Executor: Running task 0.0 in stage 24.0 (TID 24)
18/01/22 15:39:27 INFO r.RRunner: Times: boot = 0.207 s, init = 0.700 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.050 s, total = 0.967 s
18/01/22 15:39:27 INFO executor.Executor: Finished task 0.0 in stage 24.0 (TID 24). 1394 bytes result sent to driver
18/01/22 15:39:27 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 24.0 (TID 24) in 980 ms on localhost (executor driver) (1/1)
18/01/22 15:39:27 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 24.0, whose tasks have all completed, from pool
18/01/22 15:39:27 INFO scheduler.DAGScheduler: ResultStage 24 (treeAggregate at WeightedLeastSquares.scala:100) finished in 0.992 s
18/01/22 15:39:27 INFO scheduler.DAGScheduler: Job 19 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.003226 s
18/01/22 15:39:27 INFO optim.WeightedLeastSquares: Number of instances: 100.
18/01/22 15:39:27 INFO optim.IterativelyReweightedLeastSquares: Iteration 3 : relative tolerance = 0.05156096625432738
18/01/22 15:39:27 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:27 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:27 INFO scheduler.DAGScheduler: Got job 20 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:27 INFO scheduler.DAGScheduler: Final stage: ResultStage 25 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:27 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:27 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:27 INFO scheduler.DAGScheduler: Submitting ResultStage 25 (MapPartitionsRDD[104] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:27 INFO memory.MemoryStore: Block broadcast_25 stored as values in memory (estimated size 34.8 KB, free 365.7 MB)
18/01/22 15:39:27 INFO memory.MemoryStore: Block broadcast_25_piece0 stored as bytes in memory (estimated size 13.7 KB, free 365.7 MB)
18/01/22 15:39:27 INFO storage.BlockManagerInfo: Added broadcast_25_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 13.7 KB, free: 366.1 MB)
18/01/22 15:39:27 INFO spark.SparkContext: Created broadcast 25 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:27 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 25 (MapPartitionsRDD[104] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:27 INFO scheduler.TaskSchedulerImpl: Adding task set 25.0 with 1 tasks
18/01/22 15:39:27 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 25.0 (TID 25, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:27 INFO executor.Executor: Running task 0.0 in stage 25.0 (TID 25)
18/01/22 15:39:28 INFO r.RRunner: Times: boot = 0.233 s, init = 0.720 s, broadcast = 0.000 s, read-input = 0.000 s, compute = 0.020 s, write-output = 0.030 s, total = 1.003 s
18/01/22 15:39:28 INFO executor.Executor: Finished task 0.0 in stage 25.0 (TID 25). 1351 bytes result sent to driver
18/01/22 15:39:28 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 25.0 (TID 25) in 1010 ms on localhost (executor driver) (1/1)
18/01/22 15:39:28 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 25.0, whose tasks have all completed, from pool
18/01/22 15:39:28 INFO scheduler.DAGScheduler: ResultStage 25 (treeAggregate at WeightedLeastSquares.scala:100) finished in 1.020 s
18/01/22 15:39:28 INFO scheduler.DAGScheduler: Job 20 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.033072 s
18/01/22 15:39:28 INFO optim.WeightedLeastSquares: Number of instances: 100.
18/01/22 15:39:28 INFO optim.IterativelyReweightedLeastSquares: Iteration 4 : relative tolerance = 0.024636285896938226
18/01/22 15:39:28 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:28 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:28 INFO scheduler.DAGScheduler: Got job 21 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:28 INFO scheduler.DAGScheduler: Final stage: ResultStage 26 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:28 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:28 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:28 INFO scheduler.DAGScheduler: Submitting ResultStage 26 (MapPartitionsRDD[106] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:28 INFO memory.MemoryStore: Block broadcast_26 stored as values in memory (estimated size 34.8 KB, free 365.7 MB)
18/01/22 15:39:28 INFO memory.MemoryStore: Block broadcast_26_piece0 stored as bytes in memory (estimated size 13.7 KB, free 365.7 MB)
18/01/22 15:39:28 INFO storage.BlockManagerInfo: Added broadcast_26_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 13.7 KB, free: 366.1 MB)
18/01/22 15:39:28 INFO spark.SparkContext: Created broadcast 26 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:28 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 26 (MapPartitionsRDD[106] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:28 INFO scheduler.TaskSchedulerImpl: Adding task set 26.0 with 1 tasks
18/01/22 15:39:28 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 26.0 (TID 26, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:28 INFO executor.Executor: Running task 0.0 in stage 26.0 (TID 26)
18/01/22 15:39:29 INFO r.RRunner: Times: boot = 0.213 s, init = 0.720 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.050 s, total = 0.993 s
18/01/22 15:39:29 INFO executor.Executor: Finished task 0.0 in stage 26.0 (TID 26). 1394 bytes result sent to driver
18/01/22 15:39:29 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 26.0 (TID 26) in 993 ms on localhost (executor driver) (1/1)
18/01/22 15:39:29 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 26.0, whose tasks have all completed, from pool
18/01/22 15:39:29 INFO scheduler.DAGScheduler: ResultStage 26 (treeAggregate at WeightedLeastSquares.scala:100) finished in 1.001 s
18/01/22 15:39:29 INFO scheduler.DAGScheduler: Job 21 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.014585 s
18/01/22 15:39:29 INFO optim.WeightedLeastSquares: Number of instances: 100.
18/01/22 15:39:29 INFO optim.IterativelyReweightedLeastSquares: Iteration 5 : relative tolerance = 0.0034321546118405433
18/01/22 15:39:29 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:29 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:29 INFO scheduler.DAGScheduler: Got job 22 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:29 INFO scheduler.DAGScheduler: Final stage: ResultStage 27 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:29 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:29 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:29 INFO scheduler.DAGScheduler: Submitting ResultStage 27 (MapPartitionsRDD[108] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:29 INFO memory.MemoryStore: Block broadcast_27 stored as values in memory (estimated size 34.8 KB, free 365.6 MB)
18/01/22 15:39:29 INFO memory.MemoryStore: Block broadcast_27_piece0 stored as bytes in memory (estimated size 13.7 KB, free 365.6 MB)
18/01/22 15:39:29 INFO storage.BlockManagerInfo: Added broadcast_27_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 13.7 KB, free: 366.1 MB)
18/01/22 15:39:29 INFO spark.SparkContext: Created broadcast 27 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:29 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 27 (MapPartitionsRDD[108] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:29 INFO scheduler.TaskSchedulerImpl: Adding task set 27.0 with 1 tasks
18/01/22 15:39:29 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 27.0 (TID 27, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:29 INFO executor.Executor: Running task 0.0 in stage 27.0 (TID 27)
18/01/22 15:39:30 INFO r.RRunner: Times: boot = 0.224 s, init = 0.700 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.050 s, total = 0.984 s
18/01/22 15:39:30 INFO executor.Executor: Finished task 0.0 in stage 27.0 (TID 27). 1351 bytes result sent to driver
18/01/22 15:39:30 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 27.0 (TID 27) in 996 ms on localhost (executor driver) (1/1)
18/01/22 15:39:30 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 27.0, whose tasks have all completed, from pool
18/01/22 15:39:30 INFO scheduler.DAGScheduler: ResultStage 27 (treeAggregate at WeightedLeastSquares.scala:100) finished in 1.006 s
18/01/22 15:39:30 INFO scheduler.DAGScheduler: Job 22 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.018652 s
18/01/22 15:39:30 INFO optim.WeightedLeastSquares: Number of instances: 100.
18/01/22 15:39:30 INFO optim.IterativelyReweightedLeastSquares: Iteration 6 : relative tolerance = 5.874178938136687E-5
18/01/22 15:39:30 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:30 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:30 INFO scheduler.DAGScheduler: Got job 23 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:30 INFO scheduler.DAGScheduler: Final stage: ResultStage 28 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:30 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:30 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:30 INFO scheduler.DAGScheduler: Submitting ResultStage 28 (MapPartitionsRDD[110] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:30 INFO memory.MemoryStore: Block broadcast_28 stored as values in memory (estimated size 34.8 KB, free 365.6 MB)
18/01/22 15:39:30 INFO memory.MemoryStore: Block broadcast_28_piece0 stored as bytes in memory (estimated size 13.7 KB, free 365.6 MB)
18/01/22 15:39:30 INFO storage.BlockManagerInfo: Added broadcast_28_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 13.7 KB, free: 366.1 MB)
18/01/22 15:39:30 INFO spark.SparkContext: Created broadcast 28 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:30 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 28 (MapPartitionsRDD[110] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:30 INFO scheduler.TaskSchedulerImpl: Adding task set 28.0 with 1 tasks
18/01/22 15:39:30 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 28.0 (TID 28, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:30 INFO executor.Executor: Running task 0.0 in stage 28.0 (TID 28)
18/01/22 15:39:31 INFO r.RRunner: Times: boot = 0.205 s, init = 0.720 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.050 s, total = 0.985 s
18/01/22 15:39:31 INFO executor.Executor: Finished task 0.0 in stage 28.0 (TID 28). 1394 bytes result sent to driver
18/01/22 15:39:31 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 28.0 (TID 28) in 989 ms on localhost (executor driver) (1/1)
18/01/22 15:39:31 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 28.0, whose tasks have all completed, from pool
18/01/22 15:39:31 INFO scheduler.DAGScheduler: ResultStage 28 (treeAggregate at WeightedLeastSquares.scala:100) finished in 0.998 s
18/01/22 15:39:31 INFO scheduler.DAGScheduler: Job 23 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.010319 s
18/01/22 15:39:31 INFO optim.WeightedLeastSquares: Number of instances: 100.
18/01/22 15:39:31 INFO optim.IterativelyReweightedLeastSquares: IRLS converged in 7 iterations.
18/01/22 15:39:31 INFO optim.IterativelyReweightedLeastSquares: Iteration 7 : relative tolerance = 1.7740893754059073E-8
18/01/22 15:39:31 INFO util.Instrumentation: GeneralizedLinearRegression-glm_0aef2c9e4757-551302425-2: training finished
18/01/22 15:39:31 INFO codegen.CodeGenerator: Code generated in 20.127379 ms
18/01/22 15:39:31 INFO spark.SparkContext: Starting job: first at GeneralizedLinearRegression.scala:1366
18/01/22 15:39:31 INFO scheduler.DAGScheduler: Registering RDD 113 (first at GeneralizedLinearRegression.scala:1366)
18/01/22 15:39:31 INFO scheduler.DAGScheduler: Got job 24 (first at GeneralizedLinearRegression.scala:1366) with 1 output partitions
18/01/22 15:39:31 INFO scheduler.DAGScheduler: Final stage: ResultStage 30 (first at GeneralizedLinearRegression.scala:1366)
18/01/22 15:39:31 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 29)
18/01/22 15:39:31 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 29)
18/01/22 15:39:31 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 29 (MapPartitionsRDD[113] at first at GeneralizedLinearRegression.scala:1366), which has no missing parents
18/01/22 15:39:31 INFO memory.MemoryStore: Block broadcast_29 stored as values in memory (estimated size 54.9 KB, free 365.5 MB)
18/01/22 15:39:32 INFO memory.MemoryStore: Block broadcast_29_piece0 stored as bytes in memory (estimated size 21.7 KB, free 365.5 MB)
18/01/22 15:39:32 INFO storage.BlockManagerInfo: Added broadcast_29_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 21.7 KB, free: 366.1 MB)
18/01/22 15:39:32 INFO spark.SparkContext: Created broadcast 29 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:32 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 29 (MapPartitionsRDD[113] at first at GeneralizedLinearRegression.scala:1366) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:32 INFO scheduler.TaskSchedulerImpl: Adding task set 29.0 with 1 tasks
18/01/22 15:39:32 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 29.0 (TID 29, localhost, executor driver, partition 0, PROCESS_LOCAL, 11860 bytes)
18/01/22 15:39:32 INFO executor.Executor: Running task 0.0 in stage 29.0 (TID 29)
18/01/22 15:39:32 INFO r.RRunner: Times: boot = 0.214 s, init = 0.710 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.030 s, total = 0.964 s
18/01/22 15:39:33 INFO executor.Executor: Finished task 0.0 in stage 29.0 (TID 29). 1614 bytes result sent to driver
18/01/22 15:39:33 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 29.0 (TID 29) in 999 ms on localhost (executor driver) (1/1)
18/01/22 15:39:33 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 29.0, whose tasks have all completed, from pool
18/01/22 15:39:33 INFO scheduler.DAGScheduler: ShuffleMapStage 29 (first at GeneralizedLinearRegression.scala:1366) finished in 1.007 s
18/01/22 15:39:33 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:39:33 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:39:33 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 30)
18/01/22 15:39:33 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:39:33 INFO scheduler.DAGScheduler: Submitting ResultStage 30 (MapPartitionsRDD[117] at first at GeneralizedLinearRegression.scala:1366), which has no missing parents
18/01/22 15:39:33 INFO memory.MemoryStore: Block broadcast_30 stored as values in memory (estimated size 8.6 KB, free 365.5 MB)
18/01/22 15:39:33 INFO memory.MemoryStore: Block broadcast_30_piece0 stored as bytes in memory (estimated size 4.3 KB, free 365.5 MB)
18/01/22 15:39:33 INFO storage.BlockManagerInfo: Added broadcast_30_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 4.3 KB, free: 366.1 MB)
18/01/22 15:39:33 INFO spark.SparkContext: Created broadcast 30 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:33 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 30 (MapPartitionsRDD[117] at first at GeneralizedLinearRegression.scala:1366) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:33 INFO scheduler.TaskSchedulerImpl: Adding task set 30.0 with 1 tasks
18/01/22 15:39:33 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 30.0 (TID 30, localhost, executor driver, partition 0, ANY, 7754 bytes)
18/01/22 15:39:33 INFO executor.Executor: Running task 0.0 in stage 30.0 (TID 30)
18/01/22 15:39:33 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
18/01/22 15:39:33 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
18/01/22 15:39:33 INFO executor.Executor: Finished task 0.0 in stage 30.0 (TID 30). 1514 bytes result sent to driver
18/01/22 15:39:33 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 30.0 (TID 30) in 6 ms on localhost (executor driver) (1/1)
18/01/22 15:39:33 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 30.0, whose tasks have all completed, from pool
18/01/22 15:39:33 INFO scheduler.DAGScheduler: ResultStage 30 (first at GeneralizedLinearRegression.scala:1366) finished in 0.018 s
18/01/22 15:39:33 INFO scheduler.DAGScheduler: Job 24 finished: first at GeneralizedLinearRegression.scala:1366, took 1.039473 s
18/01/22 15:39:33 INFO spark.SparkContext: Starting job: count at GeneralizedLinearRegression.scala:1207
18/01/22 15:39:33 INFO scheduler.DAGScheduler: Registering RDD 120 (count at GeneralizedLinearRegression.scala:1207)
18/01/22 15:39:33 INFO scheduler.DAGScheduler: Got job 25 (count at GeneralizedLinearRegression.scala:1207) with 1 output partitions
18/01/22 15:39:33 INFO scheduler.DAGScheduler: Final stage: ResultStage 32 (count at GeneralizedLinearRegression.scala:1207)
18/01/22 15:39:33 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 31)
18/01/22 15:39:33 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 31)
18/01/22 15:39:33 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 31 (MapPartitionsRDD[120] at count at GeneralizedLinearRegression.scala:1207), which has no missing parents
18/01/22 15:39:33 INFO memory.MemoryStore: Block broadcast_31 stored as values in memory (estimated size 18.3 KB, free 365.5 MB)
18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1036
18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1156
18/01/22 15:39:33 INFO memory.MemoryStore: Block broadcast_31_piece0 stored as bytes in memory (estimated size 8.2 KB, free 365.4 MB)
18/01/22 15:39:33 INFO storage.BlockManagerInfo: Added broadcast_31_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 8.2 KB, free: 366.1 MB)
18/01/22 15:39:33 INFO spark.SparkContext: Created broadcast 31 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:33 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 31 (MapPartitionsRDD[120] at count at GeneralizedLinearRegression.scala:1207) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:33 INFO scheduler.TaskSchedulerImpl: Adding task set 31.0 with 1 tasks
18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_15_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 17.6 KB, free: 366.1 MB)
18/01/22 15:39:33 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 31.0 (TID 31, localhost, executor driver, partition 0, PROCESS_LOCAL, 11860 bytes)
18/01/22 15:39:33 INFO executor.Executor: Running task 0.0 in stage 31.0 (TID 31)
18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_22_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 13.7 KB, free: 366.1 MB)
18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned shuffle 4
18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_18_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 959.0 B, free: 366.1 MB)
18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_26_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 13.7 KB, free: 366.1 MB)
18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_28_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 13.7 KB, free: 366.1 MB)
18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_24_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 13.7 KB, free: 366.1 MB)
18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_16_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 25.4 KB, free: 366.2 MB)
18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_21_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 13.6 KB, free: 366.2 MB)
18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_25_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 13.7 KB, free: 366.2 MB)
18/01/22 15:39:33 INFO
spark.ContextCleaner: Cleaned accumulator 744 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 966 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1167 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1172 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 954 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 798 18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_13_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 8.9 KB, free: 366.2 MB) 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1020 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1188 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 743 18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_20_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 13.2 KB, free: 366.2 MB) 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 994 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 800 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 911 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 903 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 795 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 957 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1010 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1113 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 799 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 963 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1099 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1112 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1179 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 909 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1197 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 904 
18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1103 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 728 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 857 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1177 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 792 18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_29_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 21.7 KB, free: 366.2 MB) 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 746 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 862 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1090 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned shuffle 5 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1160 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 752 18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_30_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 4.3 KB, free: 366.2 MB) 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 962 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1149 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1097 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1045 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1098 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1029 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 916 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 802 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 930 18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_17_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 25.3 KB, free: 366.3 MB) 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1123 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1069 18/01/22 15:39:33 INFO spark.ContextCleaner: 
Cleaned accumulator 1077 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 882 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1124 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1133 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 785 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1128 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 753 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 818 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1117 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1173 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 869 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 793 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1056 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1027 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 948 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1076 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1200 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 749 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1006 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1189 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 778 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1151 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1054 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1089 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 910 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1047 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 733 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1075 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 998 18/01/22 15:39:33 INFO 
spark.ContextCleaner: Cleaned accumulator 977 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 891 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 937 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 791 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 725 18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_14_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 4.3 KB, free: 366.3 MB) 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1191 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1042 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 986 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 762 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1053 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 902 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 843 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 721 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 922 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 809 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 955 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1008 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 877 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1014 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 961 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1137 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1140 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 741 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 776 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 929 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 836 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 925 18/01/22 
15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 908 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1094 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 834 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 907 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 729 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1078 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 856 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1159 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 850 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 944 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1022 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1087 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1064 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 827 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 840 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 951 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1174 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1162 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1202 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 996 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 978 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1001 18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_23_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 13.7 KB, free: 366.3 MB) 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 825 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 892 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 786 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 745 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned 
accumulator 755 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 995 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1183 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1196 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 848 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 779 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 920 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1026 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 912 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 747 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1000 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 726 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 890 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1024 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 896 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1067 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 807 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 717 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 960 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 789 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 750 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1051 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 830 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 952 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1135 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 758 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 959 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1003 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 739 18/01/22 15:39:33 INFO 
spark.ContextCleaner: Cleaned accumulator 849 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1013 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1100 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1163 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1204 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 990 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 919 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 775 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 872 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1116 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1168 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 889 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1095 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1192 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 819 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1040 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 881 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 943 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1019 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 822 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 887 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1011 18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_27_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 13.7 KB, free: 366.3 MB) 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 853 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 742 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 722 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 939 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1206 
18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1068 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 926 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1195 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1121 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1106 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 718 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 803 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 888 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1147 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 991 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1139 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1114 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1171 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1102 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 855 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1122 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1178 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 932 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1082 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1120 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1101 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 804 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 731 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 949 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 734 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 837 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1074 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 774 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned 
accumulator 1088 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1194 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1130 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1170 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 880 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1115 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 763 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 826 18/01/22 15:39:33 INFO storage.BlockManagerInfo: Removed broadcast_19_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 11.1 KB, free: 366.3 MB) 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 873 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 935 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 846 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 941 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1125 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1028 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 913 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1152 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1071 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 917 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1039 18/01/22 15:39:33 INFO spark.ContextCleaner: Cleaned accumulator 1083 18/01/22 15:39:34 INFO r.RRunner: Times: boot = 0.327 s, init = 0.750 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.050 s, total = 1.137 s 18/01/22 15:39:34 INFO executor.Executor: Finished task 0.0 in stage 31.0 (TID 31). 
1419 bytes result sent to driver 18/01/22 15:39:34 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 31.0 (TID 31) in 1164 ms on localhost (executor driver) (1/1) 18/01/22 15:39:34 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 31.0, whose tasks have all completed, from pool 18/01/22 15:39:34 INFO scheduler.DAGScheduler: ShuffleMapStage 31 (count at GeneralizedLinearRegression.scala:1207) finished in 1.211 s 18/01/22 15:39:34 INFO scheduler.DAGScheduler: looking for newly runnable stages 18/01/22 15:39:34 INFO scheduler.DAGScheduler: running: Set() 18/01/22 15:39:34 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 32) 18/01/22 15:39:34 INFO scheduler.DAGScheduler: failed: Set() 18/01/22 15:39:34 INFO scheduler.DAGScheduler: Submitting ResultStage 32 (MapPartitionsRDD[123] at count at GeneralizedLinearRegression.scala:1207), which has no missing parents 18/01/22 15:39:34 INFO memory.MemoryStore: Block broadcast_32 stored as values in memory (estimated size 7.1 KB, free 366.3 MB) 18/01/22 15:39:34 INFO memory.MemoryStore: Block broadcast_32_piece0 stored as bytes in memory (estimated size 3.7 KB, free 366.3 MB) 18/01/22 15:39:34 INFO storage.BlockManagerInfo: Added broadcast_32_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 3.7 KB, free: 366.3 MB) 18/01/22 15:39:34 INFO spark.SparkContext: Created broadcast 32 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:34 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 32 (MapPartitionsRDD[123] at count at GeneralizedLinearRegression.scala:1207) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:34 INFO scheduler.TaskSchedulerImpl: Adding task set 32.0 with 1 tasks 18/01/22 15:39:34 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 32.0 (TID 32, localhost, executor driver, partition 0, ANY, 7754 bytes) 18/01/22 15:39:34 INFO executor.Executor: Running task 0.0 in stage 32.0 (TID 32) 18/01/22 15:39:34 INFO storage.ShuffleBlockFetcherIterator: Getting 1 
non-empty blocks out of 1 blocks 18/01/22 15:39:34 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms 18/01/22 15:39:34 INFO executor.Executor: Finished task 0.0 in stage 32.0 (TID 32). 1696 bytes result sent to driver 18/01/22 15:39:34 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 32.0 (TID 32) in 5 ms on localhost (executor driver) (1/1) 18/01/22 15:39:34 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 32.0, whose tasks have all completed, from pool 18/01/22 15:39:34 INFO scheduler.DAGScheduler: ResultStage 32 (count at GeneralizedLinearRegression.scala:1207) finished in 0.019 s 18/01/22 15:39:34 INFO scheduler.DAGScheduler: Job 25 finished: count at GeneralizedLinearRegression.scala:1207, took 1.246269 s 18/01/22 15:39:34 INFO spark.SparkContext: Starting job: first at GeneralizedLinearRegression.scala:1321 18/01/22 15:39:34 INFO scheduler.DAGScheduler: Registering RDD 126 (first at GeneralizedLinearRegression.scala:1321) 18/01/22 15:39:34 INFO scheduler.DAGScheduler: Got job 26 (first at GeneralizedLinearRegression.scala:1321) with 1 output partitions 18/01/22 15:39:34 INFO scheduler.DAGScheduler: Final stage: ResultStage 34 (first at GeneralizedLinearRegression.scala:1321) 18/01/22 15:39:34 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 33) 18/01/22 15:39:34 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 33) 18/01/22 15:39:34 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 33 (MapPartitionsRDD[126] at first at GeneralizedLinearRegression.scala:1321), which has no missing parents 18/01/22 15:39:34 INFO memory.MemoryStore: Block broadcast_33 stored as values in memory (estimated size 20.7 KB, free 366.2 MB) 18/01/22 15:39:34 INFO memory.MemoryStore: Block broadcast_33_piece0 stored as bytes in memory (estimated size 9.0 KB, free 366.2 MB) 18/01/22 15:39:34 INFO storage.BlockManagerInfo: Added broadcast_33_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 9.0 KB, free: 
366.3 MB) 18/01/22 15:39:34 INFO spark.SparkContext: Created broadcast 33 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:34 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 33 (MapPartitionsRDD[126] at first at GeneralizedLinearRegression.scala:1321) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:34 INFO scheduler.TaskSchedulerImpl: Adding task set 33.0 with 1 tasks 18/01/22 15:39:34 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 33.0 (TID 33, localhost, executor driver, partition 0, PROCESS_LOCAL, 11860 bytes) 18/01/22 15:39:34 INFO executor.Executor: Running task 0.0 in stage 33.0 (TID 33) 18/01/22 15:39:35 INFO r.RRunner: Times: boot = 0.206 s, init = 0.720 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.050 s, total = 0.996 s 18/01/22 15:39:35 INFO executor.Executor: Finished task 0.0 in stage 33.0 (TID 33). 1462 bytes result sent to driver 18/01/22 15:39:35 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 33.0 (TID 33) in 1012 ms on localhost (executor driver) (1/1) 18/01/22 15:39:35 INFO scheduler.DAGScheduler: ShuffleMapStage 33 (first at GeneralizedLinearRegression.scala:1321) finished in 1.025 s 18/01/22 15:39:35 INFO scheduler.DAGScheduler: looking for newly runnable stages 18/01/22 15:39:35 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 33.0, whose tasks have all completed, from pool 18/01/22 15:39:35 INFO scheduler.DAGScheduler: running: Set() 18/01/22 15:39:35 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 34) 18/01/22 15:39:35 INFO scheduler.DAGScheduler: failed: Set() 18/01/22 15:39:35 INFO scheduler.DAGScheduler: Submitting ResultStage 34 (MapPartitionsRDD[130] at first at GeneralizedLinearRegression.scala:1321), which has no missing parents 18/01/22 15:39:35 INFO memory.MemoryStore: Block broadcast_34 stored as values in memory (estimated size 9.9 KB, free 366.2 MB) 18/01/22 15:39:35 INFO memory.MemoryStore: Block broadcast_34_piece0 
stored as bytes in memory (estimated size 4.7 KB, free 366.2 MB) 18/01/22 15:39:35 INFO storage.BlockManagerInfo: Added broadcast_34_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 4.7 KB, free: 366.3 MB) 18/01/22 15:39:35 INFO spark.SparkContext: Created broadcast 34 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:35 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 34 (MapPartitionsRDD[130] at first at GeneralizedLinearRegression.scala:1321) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:35 INFO scheduler.TaskSchedulerImpl: Adding task set 34.0 with 1 tasks 18/01/22 15:39:35 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 34.0 (TID 34, localhost, executor driver, partition 0, ANY, 7754 bytes) 18/01/22 15:39:35 INFO executor.Executor: Running task 0.0 in stage 34.0 (TID 34) 18/01/22 15:39:35 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks 18/01/22 15:39:35 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms 18/01/22 15:39:35 INFO executor.Executor: Finished task 0.0 in stage 34.0 (TID 34). 
1523 bytes result sent to driver 18/01/22 15:39:35 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 34.0 (TID 34) in 9 ms on localhost (executor driver) (1/1) 18/01/22 15:39:35 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 34.0, whose tasks have all completed, from pool 18/01/22 15:39:35 INFO scheduler.DAGScheduler: ResultStage 34 (first at GeneralizedLinearRegression.scala:1321) finished in 0.022 s 18/01/22 15:39:35 INFO scheduler.DAGScheduler: Job 26 finished: first at GeneralizedLinearRegression.scala:1321, took 1.077981 s 18/01/22 15:39:35 INFO spark.SparkContext: Starting job: sum at GeneralizedLinearRegression.scala:1340 18/01/22 15:39:35 INFO scheduler.DAGScheduler: Got job 27 (sum at GeneralizedLinearRegression.scala:1340) with 1 output partitions 18/01/22 15:39:35 INFO scheduler.DAGScheduler: Final stage: ResultStage 35 (sum at GeneralizedLinearRegression.scala:1340) 18/01/22 15:39:35 INFO scheduler.DAGScheduler: Parents of final stage: List() 18/01/22 15:39:35 INFO scheduler.DAGScheduler: Missing parents: List() 18/01/22 15:39:35 INFO scheduler.DAGScheduler: Submitting ResultStage 35 (MapPartitionsRDD[135] at map at GeneralizedLinearRegression.scala:1337), which has no missing parents 18/01/22 15:39:35 INFO memory.MemoryStore: Block broadcast_35 stored as values in memory (estimated size 44.9 KB, free 366.2 MB) 18/01/22 15:39:35 INFO memory.MemoryStore: Block broadcast_35_piece0 stored as bytes in memory (estimated size 18.5 KB, free 366.2 MB) 18/01/22 15:39:35 INFO storage.BlockManagerInfo: Added broadcast_35_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 18.5 KB, free: 366.3 MB) 18/01/22 15:39:35 INFO spark.SparkContext: Created broadcast 35 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:35 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 35 (MapPartitionsRDD[135] at map at GeneralizedLinearRegression.scala:1337) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:35 INFO 
scheduler.TaskSchedulerImpl: Adding task set 35.0 with 1 tasks 18/01/22 15:39:35 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 35.0 (TID 35, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes) 18/01/22 15:39:35 INFO executor.Executor: Running task 0.0 in stage 35.0 (TID 35) 18/01/22 15:39:36 INFO r.RRunner: Times: boot = 0.220 s, init = 0.720 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.050 s, total = 1.010 s 18/01/22 15:39:36 INFO executor.Executor: Finished task 0.0 in stage 35.0 (TID 35). 1213 bytes result sent to driver 18/01/22 15:39:36 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 35.0 (TID 35) in 1008 ms on localhost (executor driver) (1/1) 18/01/22 15:39:36 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 35.0, whose tasks have all completed, from pool 18/01/22 15:39:36 INFO scheduler.DAGScheduler: ResultStage 35 (sum at GeneralizedLinearRegression.scala:1340) finished in 1.015 s 18/01/22 15:39:36 INFO scheduler.DAGScheduler: Job 27 finished: sum at GeneralizedLinearRegression.scala:1340, took 1.023136 s 18/01/22 15:39:36 INFO codegen.CodeGenerator: Code generated in 20.080243 ms 18/01/22 15:39:36 INFO spark.SparkContext: Starting job: sum at GeneralizedLinearRegression.scala:1351 18/01/22 15:39:36 INFO scheduler.DAGScheduler: Got job 28 (sum at GeneralizedLinearRegression.scala:1351) with 1 output partitions 18/01/22 15:39:36 INFO scheduler.DAGScheduler: Final stage: ResultStage 36 (sum at GeneralizedLinearRegression.scala:1351) 18/01/22 15:39:36 INFO scheduler.DAGScheduler: Parents of final stage: List() 18/01/22 15:39:36 INFO scheduler.DAGScheduler: Missing parents: List() 18/01/22 15:39:36 INFO scheduler.DAGScheduler: Submitting ResultStage 36 (MapPartitionsRDD[140] at map at GeneralizedLinearRegression.scala:1348), which has no missing parents 18/01/22 15:39:36 INFO memory.MemoryStore: Block broadcast_36 stored as values in memory (estimated size 51.7 KB, free 366.1 MB) 
18/01/22 15:39:36 INFO memory.MemoryStore: Block broadcast_36_piece0 stored as bytes in memory (estimated size 20.7 KB, free 366.1 MB)
18/01/22 15:39:36 INFO storage.BlockManagerInfo: Added broadcast_36_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 20.7 KB, free: 366.2 MB)
18/01/22 15:39:36 INFO spark.SparkContext: Created broadcast 36 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:36 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 36 (MapPartitionsRDD[140] at map at GeneralizedLinearRegression.scala:1348) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:36 INFO scheduler.TaskSchedulerImpl: Adding task set 36.0 with 1 tasks
18/01/22 15:39:36 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 36.0 (TID 36, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:36 INFO executor.Executor: Running task 0.0 in stage 36.0 (TID 36)
18/01/22 15:39:38 INFO r.RRunner: Times: boot = 0.477 s, init = 0.810 s, broadcast = 0.000 s, read-input = 0.000 s, compute = 0.000 s, write-output = 0.050 s, total = 1.337 s
18/01/22 15:39:38 INFO executor.Executor: Finished task 0.0 in stage 36.0 (TID 36). 1213 bytes result sent to driver
18/01/22 15:39:38 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 36.0 (TID 36) in 1346 ms on localhost (executor driver) (1/1)
18/01/22 15:39:38 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 36.0, whose tasks have all completed, from pool
18/01/22 15:39:38 INFO scheduler.DAGScheduler: ResultStage 36 (sum at GeneralizedLinearRegression.scala:1351) finished in 1.359 s
18/01/22 15:39:38 INFO scheduler.DAGScheduler: Job 28 finished: sum at GeneralizedLinearRegression.scala:1351, took 1.369011 s
18/01/22 15:39:38 INFO spark.SparkContext: Starting job: first at GeneralizedLinearRegression.scala:1373
18/01/22 15:39:38 INFO scheduler.DAGScheduler: Registering RDD 143 (first at GeneralizedLinearRegression.scala:1373)
18/01/22 15:39:38 INFO scheduler.DAGScheduler: Got job 29 (first at GeneralizedLinearRegression.scala:1373) with 1 output partitions
18/01/22 15:39:38 INFO scheduler.DAGScheduler: Final stage: ResultStage 38 (first at GeneralizedLinearRegression.scala:1373)
18/01/22 15:39:38 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 37)
18/01/22 15:39:38 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 37)
18/01/22 15:39:38 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 37 (MapPartitionsRDD[143] at first at GeneralizedLinearRegression.scala:1373), which has no missing parents
18/01/22 15:39:38 INFO memory.MemoryStore: Block broadcast_37 stored as values in memory (estimated size 18.7 KB, free 366.1 MB)
18/01/22 15:39:38 INFO memory.MemoryStore: Block broadcast_37_piece0 stored as bytes in memory (estimated size 8.4 KB, free 366.1 MB)
18/01/22 15:39:38 INFO storage.BlockManagerInfo: Added broadcast_37_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 8.4 KB, free: 366.2 MB)
18/01/22 15:39:38 INFO spark.SparkContext: Created broadcast 37 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:38 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 37 (MapPartitionsRDD[143] at first at GeneralizedLinearRegression.scala:1373) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:38 INFO scheduler.TaskSchedulerImpl: Adding task set 37.0 with 1 tasks
18/01/22 15:39:38 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 37.0 (TID 37, localhost, executor driver, partition 0, PROCESS_LOCAL, 11860 bytes)
18/01/22 15:39:38 INFO executor.Executor: Running task 0.0 in stage 37.0 (TID 37)
18/01/22 15:39:39 INFO r.RRunner: Times: boot = 0.227 s, init = 0.740 s, broadcast = 0.000 s, read-input = 0.000 s, compute = 0.000 s, write-output = 0.060 s, total = 1.027 s
18/01/22 15:39:39 INFO executor.Executor: Finished task 0.0 in stage 37.0 (TID 37). 1505 bytes result sent to driver
18/01/22 15:39:39 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 37.0 (TID 37) in 1042 ms on localhost (executor driver) (1/1)
18/01/22 15:39:39 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 37.0, whose tasks have all completed, from pool
18/01/22 15:39:39 INFO scheduler.DAGScheduler: ShuffleMapStage 37 (first at GeneralizedLinearRegression.scala:1373) finished in 1.048 s
18/01/22 15:39:39 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:39:39 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:39:39 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 38)
18/01/22 15:39:39 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:39:39 INFO scheduler.DAGScheduler: Submitting ResultStage 38 (MapPartitionsRDD[147] at first at GeneralizedLinearRegression.scala:1373), which has no missing parents
18/01/22 15:39:39 INFO memory.MemoryStore: Block broadcast_38 stored as values in memory (estimated size 8.6 KB, free 366.1 MB)
18/01/22 15:39:39 INFO memory.MemoryStore: Block broadcast_38_piece0 stored as bytes in memory (estimated size 4.3 KB, free 366.0 MB)
18/01/22 15:39:39 INFO storage.BlockManagerInfo: Added broadcast_38_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 4.3 KB, free: 366.2 MB)
18/01/22 15:39:39 INFO spark.SparkContext: Created broadcast 38 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:39 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 38 (MapPartitionsRDD[147] at first at GeneralizedLinearRegression.scala:1373) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:39 INFO scheduler.TaskSchedulerImpl: Adding task set 38.0 with 1 tasks
18/01/22 15:39:39 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 38.0 (TID 38, localhost, executor driver, partition 0, ANY, 7754 bytes)
18/01/22 15:39:39 INFO executor.Executor: Running task 0.0 in stage 38.0 (TID 38)
18/01/22 15:39:39 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
18/01/22 15:39:39 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 2 ms
18/01/22 15:39:39 INFO executor.Executor: Finished task 0.0 in stage 38.0 (TID 38). 1508 bytes result sent to driver
18/01/22 15:39:39 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 38.0 (TID 38) in 10 ms on localhost (executor driver) (1/1)
18/01/22 15:39:39 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 38.0, whose tasks have all completed, from pool
18/01/22 15:39:39 INFO scheduler.DAGScheduler: ResultStage 38 (first at GeneralizedLinearRegression.scala:1373) finished in 0.026 s
18/01/22 15:39:39 INFO scheduler.DAGScheduler: Job 29 finished: first at GeneralizedLinearRegression.scala:1373, took 1.088289 s
18/01/22 15:39:39 INFO spark.SparkContext: Starting job: sum at GeneralizedLinearRegression.scala:830
18/01/22 15:39:39 INFO scheduler.DAGScheduler: Got job 30 (sum at GeneralizedLinearRegression.scala:830) with 1 output partitions
18/01/22 15:39:39 INFO scheduler.DAGScheduler: Final stage: ResultStage 39 (sum at GeneralizedLinearRegression.scala:830)
18/01/22 15:39:39 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:39 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:39 INFO scheduler.DAGScheduler: Submitting ResultStage 39 (MapPartitionsRDD[153] at map at GeneralizedLinearRegression.scala:828), which has no missing parents
18/01/22 15:39:39 INFO memory.MemoryStore: Block broadcast_39 stored as values in memory (estimated size 33.1 KB, free 366.0 MB)
18/01/22 15:39:39 INFO memory.MemoryStore: Block broadcast_39_piece0 stored as bytes in memory (estimated size 14.2 KB, free 366.0 MB)
18/01/22 15:39:39 INFO storage.BlockManagerInfo: Added broadcast_39_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 14.2 KB, free: 366.2 MB)
18/01/22 15:39:39 INFO spark.SparkContext: Created broadcast 39 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:39 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 39 (MapPartitionsRDD[153] at map at GeneralizedLinearRegression.scala:828) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:39 INFO scheduler.TaskSchedulerImpl: Adding task set 39.0 with 1 tasks
18/01/22 15:39:39 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 39.0 (TID 39, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:39 INFO executor.Executor: Running task 0.0 in stage 39.0 (TID 39)
18/01/22 15:39:40 INFO r.RRunner: Times: boot = 0.260 s, init = 0.780 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.040 s, total = 1.100 s
18/01/22 15:39:40 INFO executor.Executor: Finished task 0.0 in stage 39.0 (TID 39). 1061 bytes result sent to driver
18/01/22 15:39:40 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 39.0 (TID 39) in 1126 ms on localhost (executor driver) (1/1)
18/01/22 15:39:40 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 39.0, whose tasks have all completed, from pool
18/01/22 15:39:40 INFO scheduler.DAGScheduler: ResultStage 39 (sum at GeneralizedLinearRegression.scala:830) finished in 1.144 s
18/01/22 15:39:40 INFO scheduler.DAGScheduler: Job 30 finished: sum at GeneralizedLinearRegression.scala:830, took 1.174599 s
18/01/22 15:39:40 INFO codegen.CodeGenerator: Code generated in 14.904143 ms
18/01/22 15:39:40 INFO spark.SparkContext: Starting job: approxQuantile at NativeMethodAccessorImpl.java:0
18/01/22 15:39:40 INFO scheduler.DAGScheduler: Got job 31 (approxQuantile at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:39:40 INFO scheduler.DAGScheduler: Final stage: ResultStage 40 (approxQuantile at NativeMethodAccessorImpl.java:0)
18/01/22 15:39:40 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:40 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:40 INFO scheduler.DAGScheduler: Submitting ResultStage 40 (MapPartitionsRDD[158] at approxQuantile at NativeMethodAccessorImpl.java:0), which has no missing parents
18/01/22 15:39:40 INFO memory.MemoryStore: Block broadcast_40 stored as values in memory (estimated size 54.7 KB, free 365.9 MB)
18/01/22 15:39:40 INFO memory.MemoryStore: Block broadcast_40_piece0 stored as bytes in memory (estimated size 21.4 KB, free 365.9 MB)
18/01/22 15:39:40 INFO storage.BlockManagerInfo: Added broadcast_40_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 21.4 KB, free: 366.2 MB)
18/01/22 15:39:40 INFO spark.SparkContext: Created broadcast 40 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:40 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 40 (MapPartitionsRDD[158] at approxQuantile at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:40 INFO scheduler.TaskSchedulerImpl: Adding task set 40.0 with 1 tasks
18/01/22 15:39:40 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 40.0 (TID 40, localhost, executor driver, partition 0, PROCESS_LOCAL, 11871 bytes)
18/01/22 15:39:40 INFO executor.Executor: Running task 0.0 in stage 40.0 (TID 40)
18/01/22 15:39:41 INFO r.RRunner: Times: boot = 0.223 s, init = 0.720 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.050 s, total = 1.013 s
18/01/22 15:39:41 INFO executor.Executor: Finished task 0.0 in stage 40.0 (TID 40). 4067 bytes result sent to driver
18/01/22 15:39:41 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 40.0 (TID 40) in 1017 ms on localhost (executor driver) (1/1)
18/01/22 15:39:41 INFO scheduler.DAGScheduler: ResultStage 40 (approxQuantile at NativeMethodAccessorImpl.java:0) finished in 1.030 s
18/01/22 15:39:41 INFO scheduler.DAGScheduler: Job 31 finished: approxQuantile at NativeMethodAccessorImpl.java:0, took 1.037268 s
18/01/22 15:39:41 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 40.0, whose tasks have all completed, from pool
18/01/22 15:39:41 INFO spark.SparkContext: Starting job: countByValue at StringIndexer.scala:140
18/01/22 15:39:41 INFO scheduler.DAGScheduler: Registering RDD 165 (countByValue at StringIndexer.scala:140)
18/01/22 15:39:41 INFO scheduler.DAGScheduler: Got job 32 (countByValue at StringIndexer.scala:140) with 1 output partitions
18/01/22 15:39:41 INFO scheduler.DAGScheduler: Final stage: ResultStage 42 (countByValue at StringIndexer.scala:140)
18/01/22 15:39:41 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 41)
18/01/22 15:39:41 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 41)
18/01/22 15:39:41 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 41 (MapPartitionsRDD[165] at countByValue at StringIndexer.scala:140), which has no missing parents
18/01/22 15:39:41 INFO memory.MemoryStore: Block broadcast_41 stored as values in memory (estimated size 22.2 KB, free 365.9 MB)
18/01/22 15:39:41 INFO memory.MemoryStore: Block broadcast_41_piece0 stored as bytes in memory (estimated size 9.7 KB, free 365.9 MB)
18/01/22 15:39:41 INFO storage.BlockManagerInfo: Added broadcast_41_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 9.7 KB, free: 366.2 MB)
18/01/22 15:39:41 INFO spark.SparkContext: Created broadcast 41 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:41 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 41 (MapPartitionsRDD[165] at countByValue at StringIndexer.scala:140) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:41 INFO scheduler.TaskSchedulerImpl: Adding task set 41.0 with 1 tasks
18/01/22 15:39:41 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 41.0 (TID 41, localhost, executor driver, partition 0, PROCESS_LOCAL, 22310 bytes)
18/01/22 15:39:41 INFO executor.Executor: Running task 0.0 in stage 41.0 (TID 41)
18/01/22 15:39:42 INFO r.RRunner: Times: boot = 0.214 s, init = 0.740 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.080 s, total = 1.044 s
18/01/22 15:39:42 INFO executor.Executor: Finished task 0.0 in stage 41.0 (TID 41). 1402 bytes result sent to driver
18/01/22 15:39:42 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 41.0 (TID 41) in 1070 ms on localhost (executor driver) (1/1)
18/01/22 15:39:42 INFO scheduler.DAGScheduler: ShuffleMapStage 41 (countByValue at StringIndexer.scala:140) finished in 1.083 s
18/01/22 15:39:42 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:39:42 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:39:42 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 42)
18/01/22 15:39:42 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:39:42 INFO scheduler.DAGScheduler: Submitting ResultStage 42 (ShuffledRDD[166] at countByValue at StringIndexer.scala:140), which has no missing parents
18/01/22 15:39:42 INFO memory.MemoryStore: Block broadcast_42 stored as values in memory (estimated size 3.2 KB, free 365.9 MB)
18/01/22 15:39:42 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 41.0, whose tasks have all completed, from pool
18/01/22 15:39:42 INFO memory.MemoryStore: Block broadcast_42_piece0 stored as bytes in memory (estimated size 1970.0 B, free 365.9 MB)
18/01/22 15:39:42 INFO storage.BlockManagerInfo: Added broadcast_42_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 1970.0 B, free: 366.2 MB)
18/01/22 15:39:42 INFO spark.SparkContext: Created broadcast 42 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:42 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 42 (ShuffledRDD[166] at countByValue at StringIndexer.scala:140) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:42 INFO scheduler.TaskSchedulerImpl: Adding task set 42.0 with 1 tasks
18/01/22 15:39:42 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 42.0 (TID 42, localhost, executor driver, partition 0, ANY, 7649 bytes)
18/01/22 15:39:42 INFO executor.Executor: Running task 0.0 in stage 42.0 (TID 42)
18/01/22 15:39:42 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
18/01/22 15:39:42 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
18/01/22 15:39:42 INFO executor.Executor: Finished task 0.0 in stage 42.0 (TID 42). 1290 bytes result sent to driver
18/01/22 15:39:42 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 42.0 (TID 42) in 16 ms on localhost (executor driver) (1/1)
18/01/22 15:39:42 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 42.0, whose tasks have all completed, from pool
18/01/22 15:39:42 INFO scheduler.DAGScheduler: ResultStage 42 (countByValue at StringIndexer.scala:140) finished in 0.025 s
18/01/22 15:39:42 INFO scheduler.DAGScheduler: Job 32 finished: countByValue at StringIndexer.scala:140, took 1.121557 s
18/01/22 15:39:42 INFO spark.SparkContext: Starting job: first at GeneralizedLinearRegression.scala:379
18/01/22 15:39:42 INFO scheduler.DAGScheduler: Got job 33 (first at GeneralizedLinearRegression.scala:379) with 1 output partitions
18/01/22 15:39:42 INFO scheduler.DAGScheduler: Final stage: ResultStage 43 (first at GeneralizedLinearRegression.scala:379)
18/01/22 15:39:42 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:42 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:42 INFO scheduler.DAGScheduler: Submitting ResultStage 43 (MapPartitionsRDD[170] at first at GeneralizedLinearRegression.scala:379), which has no missing parents
18/01/22 15:39:42 INFO memory.MemoryStore: Block broadcast_43 stored as values in memory (estimated size 40.2 KB, free 365.9 MB)
18/01/22 15:39:42 INFO memory.MemoryStore: Block broadcast_43_piece0 stored as bytes in memory (estimated size 14.9 KB, free 365.8 MB)
18/01/22 15:39:42 INFO storage.BlockManagerInfo: Added broadcast_43_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 14.9 KB, free: 366.2 MB)
18/01/22 15:39:42 INFO spark.SparkContext: Created broadcast 43 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:42 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 43 (MapPartitionsRDD[170] at first at GeneralizedLinearRegression.scala:379) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:42 INFO scheduler.TaskSchedulerImpl: Adding task set 43.0 with 1 tasks
18/01/22 15:39:42 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 43.0 (TID 43, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:42 INFO executor.Executor: Running task 0.0 in stage 43.0 (TID 43)
18/01/22 15:39:44 INFO executor.Executor: Finished task 0.0 in stage 43.0 (TID 43). 1139 bytes result sent to driver
18/01/22 15:39:44 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 43.0 (TID 43) in 1699 ms on localhost (executor driver) (1/1)
18/01/22 15:39:44 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 43.0, whose tasks have all completed, from pool
18/01/22 15:39:44 INFO scheduler.DAGScheduler: ResultStage 43 (first at GeneralizedLinearRegression.scala:379) finished in 1.707 s
18/01/22 15:39:44 INFO scheduler.DAGScheduler: Job 33 finished: first at GeneralizedLinearRegression.scala:379, took 1.714215 s
18/01/22 15:39:44 INFO util.Instrumentation: GeneralizedLinearRegression-glm_6285200e2bf0-2024481417-3: training: numPartitions=1 storageLevel=StorageLevel(1 replicas)
18/01/22 15:39:44 INFO util.Instrumentation: GeneralizedLinearRegression-glm_6285200e2bf0-2024481417-3: {"featuresCol":"features","fitIntercept":true,"regParam":0.0,"tol":1.0E-6,"family":"tweedie","maxIter":25}
18/01/22 15:39:44 INFO util.Instrumentation: GeneralizedLinearRegression-glm_6285200e2bf0-2024481417-3: {"numFeatures":3}
18/01/22 15:39:44 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:44 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:44 INFO scheduler.DAGScheduler: Got job 34 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:44 INFO scheduler.DAGScheduler: Final stage: ResultStage 44 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:44 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:44 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:44 INFO scheduler.DAGScheduler: Submitting ResultStage 44 (MapPartitionsRDD[181] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:44 INFO memory.MemoryStore: Block broadcast_44 stored as values in memory (estimated size 43.2 KB, free 365.8 MB)
18/01/22 15:39:44 INFO memory.MemoryStore: Block broadcast_44_piece0 stored as bytes in memory (estimated size 17.3 KB, free 365.8 MB)
18/01/22 15:39:44 INFO storage.BlockManagerInfo: Added broadcast_44_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 17.3 KB, free: 366.1 MB)
18/01/22 15:39:44 INFO spark.SparkContext: Created broadcast 44 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:44 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 44 (MapPartitionsRDD[181] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:44 INFO scheduler.TaskSchedulerImpl: Adding task set 44.0 with 1 tasks
18/01/22 15:39:44 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 44.0 (TID 44, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:44 INFO executor.Executor: Running task 0.0 in stage 44.0 (TID 44)
18/01/22 15:39:45 INFO r.RRunner: Times: boot = 0.244 s, init = 0.760 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.070 s, total = 1.094 s
18/01/22 15:39:45 INFO executor.Executor: Finished task 0.0 in stage 44.0 (TID 44). 1509 bytes result sent to driver
18/01/22 15:39:45 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 44.0 (TID 44) in 1120 ms on localhost (executor driver) (1/1)
18/01/22 15:39:45 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 44.0, whose tasks have all completed, from pool
18/01/22 15:39:45 INFO scheduler.DAGScheduler: ResultStage 44 (treeAggregate at WeightedLeastSquares.scala:100) finished in 1.140 s
18/01/22 15:39:45 INFO scheduler.DAGScheduler: Job 34 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.154367 s
18/01/22 15:39:45 INFO optim.WeightedLeastSquares: Number of instances: 150.
18/01/22 15:39:45 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:45 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:45 INFO scheduler.DAGScheduler: Got job 35 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:45 INFO scheduler.DAGScheduler: Final stage: ResultStage 45 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:45 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:45 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:45 INFO scheduler.DAGScheduler: Submitting ResultStage 45 (MapPartitionsRDD[183] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:45 INFO memory.MemoryStore: Block broadcast_45 stored as values in memory (estimated size 43.8 KB, free 365.7 MB)
18/01/22 15:39:45 INFO memory.MemoryStore: Block broadcast_45_piece0 stored as bytes in memory (estimated size 17.6 KB, free 365.7 MB)
18/01/22 15:39:45 INFO storage.BlockManagerInfo: Added broadcast_45_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 17.6 KB, free: 366.1 MB)
18/01/22 15:39:45 INFO spark.SparkContext: Created broadcast 45 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:45 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 45 (MapPartitionsRDD[183] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:45 INFO scheduler.TaskSchedulerImpl: Adding task set 45.0 with 1 tasks
18/01/22 15:39:45 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 45.0 (TID 45, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:45 INFO executor.Executor: Running task 0.0 in stage 45.0 (TID 45)
18/01/22 15:39:47 INFO r.RRunner: Times: boot = 0.231 s, init = 0.740 s, broadcast = 0.000 s, read-input = 0.010 s, compute = 0.000 s, write-output = 0.080 s, total = 1.061 s
18/01/22 15:39:47 INFO executor.Executor: Finished task 0.0 in stage 45.0 (TID 45). 1466 bytes result sent to driver
18/01/22 15:39:47 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 45.0 (TID 45) in 1057 ms on localhost (executor driver) (1/1)
18/01/22 15:39:47 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 45.0, whose tasks have all completed, from pool
18/01/22 15:39:47 INFO scheduler.DAGScheduler: ResultStage 45 (treeAggregate at WeightedLeastSquares.scala:100) finished in 1.062 s
18/01/22 15:39:47 INFO scheduler.DAGScheduler: Job 35 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.078057 s
18/01/22 15:39:47 INFO optim.WeightedLeastSquares: Number of instances: 150.
18/01/22 15:39:47 INFO optim.IterativelyReweightedLeastSquares: Iteration 0 : relative tolerance = 0.0040364489895034494
18/01/22 15:39:47 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:47 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:47 INFO scheduler.DAGScheduler: Got job 36 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:47 INFO scheduler.DAGScheduler: Final stage: ResultStage 46 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:47 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:47 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:47 INFO scheduler.DAGScheduler: Submitting ResultStage 46 (MapPartitionsRDD[185] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:47 INFO memory.MemoryStore: Block broadcast_46 stored as values in memory (estimated size 43.9 KB, free 365.7 MB)
18/01/22 15:39:47 INFO memory.MemoryStore: Block broadcast_46_piece0 stored as bytes in memory (estimated size 17.7 KB, free 365.7 MB)
18/01/22 15:39:47 INFO storage.BlockManagerInfo: Added broadcast_46_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 17.7 KB, free: 366.1 MB)
18/01/22 15:39:47 INFO spark.SparkContext: Created broadcast 46 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:47 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 46 (MapPartitionsRDD[185] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:47 INFO scheduler.TaskSchedulerImpl: Adding task set 46.0 with 1 tasks
18/01/22 15:39:47 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 46.0 (TID 46, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:47 INFO executor.Executor: Running task 0.0 in stage 46.0 (TID 46)
18/01/22 15:39:48 INFO r.RRunner: Times: boot = 0.226 s, init = 0.750 s, broadcast = 0.000 s, read-input = 0.030 s, compute = 0.000 s, write-output = 0.120 s, total = 1.126 s
18/01/22 15:39:48 INFO executor.Executor: Finished task 0.0 in stage 46.0 (TID 46). 1423 bytes result sent to driver
18/01/22 15:39:48 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 46.0 (TID 46) in 1135 ms on localhost (executor driver) (1/1)
18/01/22 15:39:48 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 46.0, whose tasks have all completed, from pool
18/01/22 15:39:48 INFO scheduler.DAGScheduler: ResultStage 46 (treeAggregate at WeightedLeastSquares.scala:100) finished in 1.146 s
18/01/22 15:39:48 INFO scheduler.DAGScheduler: Job 36 finished: treeAggregate at WeightedLeastSquares.scala:100, took 1.158982 s
18/01/22 15:39:48 INFO optim.WeightedLeastSquares: Number of instances: 150.
18/01/22 15:39:48 INFO optim.IterativelyReweightedLeastSquares: Iteration 1 : relative tolerance = 2.8252330687650318E-5
18/01/22 15:39:48 WARN optim.WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
18/01/22 15:39:48 INFO spark.SparkContext: Starting job: treeAggregate at WeightedLeastSquares.scala:100
18/01/22 15:39:48 INFO scheduler.DAGScheduler: Got job 37 (treeAggregate at WeightedLeastSquares.scala:100) with 1 output partitions
18/01/22 15:39:48 INFO scheduler.DAGScheduler: Final stage: ResultStage 47 (treeAggregate at WeightedLeastSquares.scala:100)
18/01/22 15:39:48 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:48 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:48 INFO scheduler.DAGScheduler: Submitting ResultStage 47 (MapPartitionsRDD[187] at treeAggregate at WeightedLeastSquares.scala:100), which has no missing parents
18/01/22 15:39:48 INFO memory.MemoryStore: Block broadcast_47 stored as values in memory (estimated size 43.9 KB, free 365.6 MB)
18/01/22 15:39:48 INFO memory.MemoryStore: Block broadcast_47_piece0 stored as bytes in memory (estimated size 17.7 KB, free 365.6 MB)
18/01/22 15:39:48 INFO storage.BlockManagerInfo: Added broadcast_47_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 17.7 KB, free: 366.1 MB)
18/01/22 15:39:48 INFO spark.SparkContext: Created broadcast 47 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:48 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 47 (MapPartitionsRDD[187] at treeAggregate at WeightedLeastSquares.scala:100) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:48 INFO scheduler.TaskSchedulerImpl: Adding task set 47.0 with 1 tasks
18/01/22 15:39:48 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 47.0 (TID 47, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:48 INFO executor.Executor: Running task 0.0 in stage 47.0 (TID 47)
18/01/22 15:39:50 INFO r.RRunner: Times: boot = 0.439 s, init = 1.560 s, broadcast = 0.000 s, read-input = 0.030 s, compute = 0.000 s, write-output = 0.190 s, total = 2.219 s
18/01/22 15:39:50 INFO executor.Executor: Finished task 0.0 in stage 47.0 (TID 47). 1466 bytes result sent to driver
18/01/22 15:39:50 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 47.0 (TID 47) in 2239 ms on localhost (executor driver) (1/1)
18/01/22 15:39:50 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 47.0, whose tasks have all completed, from pool
18/01/22 15:39:50 INFO scheduler.DAGScheduler: ResultStage 47 (treeAggregate at WeightedLeastSquares.scala:100) finished in 2.253 s
18/01/22 15:39:50 INFO scheduler.DAGScheduler: Job 37 finished: treeAggregate at WeightedLeastSquares.scala:100, took 2.266779 s
18/01/22 15:39:50 INFO optim.WeightedLeastSquares: Number of instances: 150.
18/01/22 15:39:50 INFO optim.IterativelyReweightedLeastSquares: IRLS converged in 2 iterations.
18/01/22 15:39:50 INFO optim.IterativelyReweightedLeastSquares: Iteration 2 : relative tolerance = 1.6315129069965906E-7
18/01/22 15:39:50 INFO util.Instrumentation: GeneralizedLinearRegression-glm_6285200e2bf0-2024481417-3: training finished
18/01/22 15:39:50 INFO spark.SparkContext: Starting job: first at GeneralizedLinearRegression.scala:1366
18/01/22 15:39:50 INFO scheduler.DAGScheduler: Registering RDD 190 (first at GeneralizedLinearRegression.scala:1366)
18/01/22 15:39:50 INFO scheduler.DAGScheduler: Got job 38 (first at GeneralizedLinearRegression.scala:1366) with 1 output partitions
18/01/22 15:39:50 INFO scheduler.DAGScheduler: Final stage: ResultStage 49 (first at GeneralizedLinearRegression.scala:1366)
18/01/22 15:39:50 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 48)
18/01/22 15:39:50 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 48)
18/01/22 15:39:50 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 48 (MapPartitionsRDD[190] at first at GeneralizedLinearRegression.scala:1366), which has no missing parents
18/01/22 15:39:50 INFO memory.MemoryStore: Block broadcast_48 stored as values in memory (estimated size 67.3 KB, free 365.5 MB)
18/01/22 15:39:50 INFO memory.MemoryStore: Block broadcast_48_piece0 stored as bytes in memory (estimated size 26.5 KB, free 365.5 MB)
18/01/22 15:39:50 INFO storage.BlockManagerInfo: Added broadcast_48_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 26.5 KB, free: 366.1 MB)
18/01/22 15:39:50 INFO spark.SparkContext: Created broadcast 48 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:50 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 48 (MapPartitionsRDD[190] at first at GeneralizedLinearRegression.scala:1366) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:50 INFO scheduler.TaskSchedulerImpl: Adding task set 48.0 with 1 tasks
18/01/22 15:39:50 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 48.0 (TID 48, localhost, executor driver, partition 0, PROCESS_LOCAL, 22310 bytes)
18/01/22 15:39:50 INFO executor.Executor: Running task 0.0 in stage 48.0 (TID 48)
18/01/22 15:39:52 INFO r.RRunner: Times: boot = 0.356 s, init = 1.340 s, broadcast = 0.000 s, read-input = 0.030 s, compute = 0.000 s, write-output = 0.180 s, total = 1.906 s
18/01/22 15:39:52 INFO executor.Executor: Finished task 0.0 in stage 48.0 (TID 48). 1614 bytes result sent to driver
18/01/22 15:39:52 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 48.0 (TID 48) in 1941 ms on localhost (executor driver) (1/1)
18/01/22 15:39:52 INFO scheduler.DAGScheduler: ShuffleMapStage 48 (first at GeneralizedLinearRegression.scala:1366) finished in 1.959 s
18/01/22 15:39:52 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:39:52 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 48.0, whose tasks have all completed, from pool
18/01/22 15:39:52 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:39:52 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 49)
18/01/22 15:39:52 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:39:52 INFO scheduler.DAGScheduler: Submitting ResultStage 49 (MapPartitionsRDD[194] at first at GeneralizedLinearRegression.scala:1366), which has no missing parents
18/01/22 15:39:52 INFO memory.MemoryStore: Block broadcast_49 stored as values in memory (estimated size 8.6 KB, free 365.5 MB)
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1419
18/01/22 15:39:52 INFO memory.MemoryStore: Block broadcast_49_piece0 stored as bytes in memory (estimated size 4.3 KB, free 365.5 MB)
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1526
18/01/22 15:39:52 INFO storage.BlockManagerInfo: Added broadcast_49_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 4.3 KB, free: 366.1 MB)
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1558
18/01/22 15:39:52 INFO spark.SparkContext: Created broadcast 49 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1309
18/01/22 15:39:52 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 49 (MapPartitionsRDD[194] at first at GeneralizedLinearRegression.scala:1366) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1547
18/01/22 15:39:52 INFO scheduler.TaskSchedulerImpl: Adding task set 49.0 with 1 tasks
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1551
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1554
18/01/22 15:39:52 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 49.0 (TID 49, localhost, executor driver, partition 0, ANY, 7754 bytes)
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1644
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1423
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1241
18/01/22 15:39:52 INFO executor.Executor: Running task 0.0 in stage 49.0 (TID 49)
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1461
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1304
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1227
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1406
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1509
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1654
18/01/22 15:39:52 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1422
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1400
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1448
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1273
18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1329
18/01/22 15:39:52 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote
fetches in 4 ms 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1525 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1356 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1624 18/01/22 15:39:52 INFO executor.Executor: Finished task 0.0 in stage 49.0 (TID 49). 1557 bytes result sent to driver 18/01/22 15:39:52 INFO storage.BlockManagerInfo: Removed broadcast_32_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 3.7 KB, free: 366.1 MB) 18/01/22 15:39:52 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 49.0 (TID 49) in 32 ms on localhost (executor driver) (1/1) 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1417 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1686 18/01/22 15:39:52 INFO scheduler.DAGScheduler: ResultStage 49 (first at GeneralizedLinearRegression.scala:1366) finished in 0.072 s 18/01/22 15:39:52 INFO scheduler.DAGScheduler: Job 38 finished: first at GeneralizedLinearRegression.scala:1366, took 2.062272 s 18/01/22 15:39:52 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 49.0, whose tasks have all completed, from pool 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1287 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1259 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1666 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1402 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1622 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1247 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1237 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1429 18/01/22 15:39:52 INFO spark.SparkContext: Starting job: count at GeneralizedLinearRegression.scala:1207 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1615 18/01/22 15:39:52 INFO scheduler.DAGScheduler: Registering RDD 197 (count at GeneralizedLinearRegression.scala:1207) 18/01/22 
15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1560 18/01/22 15:39:52 INFO scheduler.DAGScheduler: Got job 39 (count at GeneralizedLinearRegression.scala:1207) with 1 output partitions 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1674 18/01/22 15:39:52 INFO scheduler.DAGScheduler: Final stage: ResultStage 51 (count at GeneralizedLinearRegression.scala:1207) 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1639 18/01/22 15:39:52 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 50) 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1224 18/01/22 15:39:52 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 50) 18/01/22 15:39:52 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 50 (MapPartitionsRDD[197] at count at GeneralizedLinearRegression.scala:1207), which has no missing parents 18/01/22 15:39:52 INFO memory.MemoryStore: Block broadcast_50 stored as values in memory (estimated size 20.0 KB, free 365.5 MB) 18/01/22 15:39:52 INFO storage.BlockManagerInfo: Removed broadcast_31_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 8.2 KB, free: 366.1 MB) 18/01/22 15:39:52 INFO memory.MemoryStore: Block broadcast_50_piece0 stored as bytes in memory (estimated size 8.8 KB, free 365.5 MB) 18/01/22 15:39:52 INFO storage.BlockManagerInfo: Added broadcast_50_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 8.8 KB, free: 366.1 MB) 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1660 18/01/22 15:39:52 INFO spark.SparkContext: Created broadcast 50 from broadcast at DAGScheduler.scala:1029 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1598 18/01/22 15:39:52 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 50 (MapPartitionsRDD[197] at count at GeneralizedLinearRegression.scala:1207) (first 15 tasks are for partitions Vector(0)) 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1262 18/01/22 15:39:52 
INFO scheduler.TaskSchedulerImpl: Adding task set 50.0 with 1 tasks 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1614 18/01/22 15:39:52 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 50.0 (TID 50, localhost, executor driver, partition 0, PROCESS_LOCAL, 22310 bytes) 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1376 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1532 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1656 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1608 18/01/22 15:39:52 INFO executor.Executor: Running task 0.0 in stage 50.0 (TID 50) 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1337 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1258 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1528 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1300 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1662 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1489 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1521 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1325 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1629 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1627 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1229 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1311 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1605 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1680 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1216 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1408 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1682 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1272 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1536 18/01/22 
15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1685 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1346 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1220 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1482 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1550 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1507 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1508 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1219 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1556 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1397 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1675 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1242 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1301 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1440 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1606 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1540 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1321 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1335 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1252 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1286 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1389 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1411 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1331 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1365 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1595 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1266 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1322 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1494 18/01/22 15:39:52 INFO spark.ContextCleaner: 
Cleaned accumulator 1583 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1490 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1469 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1476 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1604 18/01/22 15:39:52 INFO spark.ContextCleaner: Cleaned accumulator 1542 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1428 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1617 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1375 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1282 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1403 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1510 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1601 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1415 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1571 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1646 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1407 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1330 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1557 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1459 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1338 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1665 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1516 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1669 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1424 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1581 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1264 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1270 18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_33_piece0 on 
DESKTOP-BBK0H95:53090 in memory (size: 9.0 KB, free: 366.1 MB) 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1360 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1371 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1500 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1385 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1623 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1381 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1504 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1588 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1455 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1524 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1587 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1483 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1667 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1382 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1386 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1299 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1222 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1647 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1344 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1351 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1240 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1358 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1555 18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_40_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 21.4 KB, free: 366.1 MB) 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1635 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1596 18/01/22 15:39:53 INFO 
spark.ContextCleaner: Cleaned accumulator 1343 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1462 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1527 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1293 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1324 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1361 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1616 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1517 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1320 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1354 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1437 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1610 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1495 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1648 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1352 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1580 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1292 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1326 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1625 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1418 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1318 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1670 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1248 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1577 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1289 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1520 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1594 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1691 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned 
accumulator 1586 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1218 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1569 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1265 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1378 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1404 18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_44_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 17.3 KB, free: 366.1 MB) 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1438 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1460 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1505 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1531 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1225 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1363 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1475 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1503 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1281 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1390 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1671 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1214 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1565 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1441 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1468 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1294 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1480 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1261 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1405 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1651 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1652 18/01/22 15:39:53 
INFO spark.ContextCleaner: Cleaned accumulator 1280 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1274 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1492 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1395 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1546 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1603 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1339 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1628 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1619 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1409 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1212 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1387 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1442 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1383 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1695 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1435 18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_45_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 17.6 KB, free: 366.1 MB) 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1312 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1373 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1481 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1650 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1473 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1478 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1673 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1315 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1263 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1233 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned 
accumulator 1284 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1465 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1537 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1291 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1631 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1549 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1600 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1512 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1336 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1279 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1447 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1597 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1562 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1328 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1340 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1643 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1366 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1413 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1692 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1552 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1333 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1353 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1342 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1434 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1305 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1687 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1355 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1464 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1314 18/01/22 15:39:53 INFO 
spark.ContextCleaner: Cleaned accumulator 1277 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1332 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1388 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1672 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1239 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1412 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1488 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1663 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1559 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1359 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1657 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1313 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1584 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1232 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1245 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1379 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1585 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1633 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1493 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1645 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1678 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1463 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1369 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1602 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1574 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1213 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1499 18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_36_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 20.7 KB, 
free: 366.2 MB) 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1307 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1443 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1676 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1297 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1445 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1345 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1217 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1319 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1276 18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_39_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 14.2 KB, free: 366.2 MB) 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1357 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1530 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1452 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1341 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1246 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1564 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1486 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1210 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1254 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1479 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1290 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1523 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1439 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1485 18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_47_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 17.7 KB, free: 366.2 MB) 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1399 18/01/22 
15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1302 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1317 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1231 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1364 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1568 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1421 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1515 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1659 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1255 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1450 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1632 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1431 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1642 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1533 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1634 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1491 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1618 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1394 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1425 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1497 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1211 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1235 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1567 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1253 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1230 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned shuffle 9 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1553 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1275 18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned 
accumulator 1472
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1221
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1295
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1487
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1334
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1432
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1677
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1458
18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_42_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 1970.0 B, free: 366.2 MB)
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1668
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1298
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1414
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1368
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1446
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1377
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1534
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1257
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1243
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1256
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1268
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1484
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1271
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1308
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1367
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1661
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1303
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1451
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1563
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1541
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1384
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1296
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1453
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1693
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1681
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1396
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1613
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1689
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1637
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1457
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1393
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1607
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1496
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1649
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1391
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1310
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1349
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1234
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1498
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1535
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1636
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1501
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned shuffle 7
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1420
18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_38_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 4.3 KB, free: 366.2 MB)
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1620
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1658
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1426
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1539
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1630
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1572
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1655
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1548
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1529
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1372
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1449
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1575
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1474
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1226
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1688
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1514
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1518
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1416
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1684
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1249
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1370
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned shuffle 8
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1513
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1593
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1694
18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_35_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 18.5 KB, free: 366.2 MB)
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1223
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1306
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1578
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1347
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1573
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1544
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1398
18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_34_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 4.7 KB, free: 366.2 MB)
18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_37_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 8.4 KB, free: 366.2 MB)
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1679
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1471
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1582
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1430
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1470
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1626
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1444
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1576
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1238
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1545
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1327
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1690
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1228
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1316
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1599
18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_43_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 14.9 KB, free: 366.2 MB)
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1538
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1348
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1467
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1285
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1374
18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_46_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 17.7 KB, free: 366.3 MB)
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1653
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1664
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1566
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1323
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1380
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1250
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1436
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1612
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1466
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1561
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned shuffle 6
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1392
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1209
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1570
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1590
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1640
18/01/22 15:39:53 INFO storage.BlockManagerInfo: Removed broadcast_41_piece0 on DESKTOP-BBK0H95:53090 in memory (size: 9.7 KB, free: 366.3 MB)
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1410
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1621
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1251
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1511
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1589
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1456
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1519
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1427
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1244
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1260
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1506
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1611
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1350
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1278
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1215
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1433
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1502
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1609
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1683
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1401
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1543
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1267
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1236
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1477
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1283
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1638
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1269
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1288
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1641
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1362
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1454
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1522
18/01/22 15:39:53 INFO spark.ContextCleaner: Cleaned accumulator 1579
18/01/22 15:39:55 INFO r.RRunner: Times: boot = 0.913 s, init = 1.110 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.060 s, total = 2.103 s
18/01/22 15:39:55 INFO
executor.Executor: Finished task 0.0 in stage 50.0 (TID 50). 1419 bytes result sent to driver
18/01/22 15:39:55 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 50.0 (TID 50) in 2131 ms on localhost (executor driver) (1/1)
18/01/22 15:39:55 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 50.0, whose tasks have all completed, from pool
18/01/22 15:39:55 INFO scheduler.DAGScheduler: ShuffleMapStage 50 (count at GeneralizedLinearRegression.scala:1207) finished in 2.154 s
18/01/22 15:39:55 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:39:55 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:39:55 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 51)
18/01/22 15:39:55 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:39:55 INFO scheduler.DAGScheduler: Submitting ResultStage 51 (MapPartitionsRDD[200] at count at GeneralizedLinearRegression.scala:1207), which has no missing parents
18/01/22 15:39:55 INFO memory.MemoryStore: Block broadcast_51 stored as values in memory (estimated size 7.1 KB, free 366.2 MB)
18/01/22 15:39:55 INFO memory.MemoryStore: Block broadcast_51_piece0 stored as bytes in memory (estimated size 3.7 KB, free 366.2 MB)
18/01/22 15:39:55 INFO storage.BlockManagerInfo: Added broadcast_51_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 3.7 KB, free: 366.3 MB)
18/01/22 15:39:55 INFO spark.SparkContext: Created broadcast 51 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:55 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 51 (MapPartitionsRDD[200] at count at GeneralizedLinearRegression.scala:1207) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:55 INFO scheduler.TaskSchedulerImpl: Adding task set 51.0 with 1 tasks
18/01/22 15:39:55 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 51.0 (TID 51, localhost, executor driver, partition 0, ANY, 7754 bytes)
18/01/22 15:39:55 INFO executor.Executor: Running task 0.0 in stage 51.0 (TID 51)
18/01/22 15:39:55 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
18/01/22 15:39:55 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 4 ms
18/01/22 15:39:55 INFO executor.Executor: Finished task 0.0 in stage 51.0 (TID 51). 1739 bytes result sent to driver
18/01/22 15:39:55 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 51.0 (TID 51) in 9 ms on localhost (executor driver) (1/1)
18/01/22 15:39:55 INFO scheduler.DAGScheduler: ResultStage 51 (count at GeneralizedLinearRegression.scala:1207) finished in 0.022 s
18/01/22 15:39:55 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 51.0, whose tasks have all completed, from pool
18/01/22 15:39:55 INFO scheduler.DAGScheduler: Job 39 finished: count at GeneralizedLinearRegression.scala:1207, took 2.217598 s
18/01/22 15:39:55 INFO spark.SparkContext: Starting job: first at GeneralizedLinearRegression.scala:1321
18/01/22 15:39:55 INFO scheduler.DAGScheduler: Registering RDD 203 (first at GeneralizedLinearRegression.scala:1321)
18/01/22 15:39:55 INFO scheduler.DAGScheduler: Got job 40 (first at GeneralizedLinearRegression.scala:1321) with 1 output partitions
18/01/22 15:39:55 INFO scheduler.DAGScheduler: Final stage: ResultStage 53 (first at GeneralizedLinearRegression.scala:1321)
18/01/22 15:39:55 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 52)
18/01/22 15:39:55 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 52)
18/01/22 15:39:55 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 52 (MapPartitionsRDD[203] at first at GeneralizedLinearRegression.scala:1321), which has no missing parents
18/01/22 15:39:55 INFO memory.MemoryStore: Block broadcast_52 stored as values in memory (estimated size 22.4 KB, free 366.1 MB)
18/01/22 15:39:55 INFO memory.MemoryStore: Block broadcast_52_piece0 stored as bytes in memory (estimated size 9.5 KB, free 366.1 MB)
18/01/22 15:39:55 INFO storage.BlockManagerInfo: Added broadcast_52_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 9.5 KB, free: 366.2 MB)
18/01/22 15:39:55 INFO spark.SparkContext: Created broadcast 52 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:55 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 52 (MapPartitionsRDD[203] at first at GeneralizedLinearRegression.scala:1321) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:55 INFO scheduler.TaskSchedulerImpl: Adding task set 52.0 with 1 tasks
18/01/22 15:39:55 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 52.0 (TID 52, localhost, executor driver, partition 0, PROCESS_LOCAL, 22310 bytes)
18/01/22 15:39:55 INFO executor.Executor: Running task 0.0 in stage 52.0 (TID 52)
18/01/22 15:39:56 INFO r.RRunner: Times: boot = 0.203 s, init = 0.720 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.080 s, total = 1.023 s
18/01/22 15:39:56 INFO executor.Executor: Finished task 0.0 in stage 52.0 (TID 52). 1462 bytes result sent to driver
18/01/22 15:39:56 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 52.0 (TID 52) in 1030 ms on localhost (executor driver) (1/1)
18/01/22 15:39:56 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 52.0, whose tasks have all completed, from pool
18/01/22 15:39:56 INFO scheduler.DAGScheduler: ShuffleMapStage 52 (first at GeneralizedLinearRegression.scala:1321) finished in 1.040 s
18/01/22 15:39:56 INFO scheduler.DAGScheduler: looking for newly runnable stages
18/01/22 15:39:56 INFO scheduler.DAGScheduler: running: Set()
18/01/22 15:39:56 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 53)
18/01/22 15:39:56 INFO scheduler.DAGScheduler: failed: Set()
18/01/22 15:39:56 INFO scheduler.DAGScheduler: Submitting ResultStage 53 (MapPartitionsRDD[207] at first at GeneralizedLinearRegression.scala:1321), which has no missing parents
18/01/22 15:39:56 INFO memory.MemoryStore: Block broadcast_53 stored as values in memory (estimated size 9.9 KB, free 366.1 MB)
18/01/22 15:39:56 INFO memory.MemoryStore: Block broadcast_53_piece0 stored as bytes in memory (estimated size 4.7 KB, free 366.1 MB)
18/01/22 15:39:56 INFO storage.BlockManagerInfo: Added broadcast_53_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 4.7 KB, free: 366.2 MB)
18/01/22 15:39:56 INFO spark.SparkContext: Created broadcast 53 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:56 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 53 (MapPartitionsRDD[207] at first at GeneralizedLinearRegression.scala:1321) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:56 INFO scheduler.TaskSchedulerImpl: Adding task set 53.0 with 1 tasks
18/01/22 15:39:56 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 53.0 (TID 53, localhost, executor driver, partition 0, ANY, 7754 bytes)
18/01/22 15:39:56 INFO executor.Executor: Running task 0.0 in stage 53.0 (TID 53)
18/01/22 15:39:56 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
18/01/22 15:39:56 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 3 ms
18/01/22 15:39:56 INFO executor.Executor: Finished task 0.0 in stage 53.0 (TID 53). 1523 bytes result sent to driver
18/01/22 15:39:56 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 53.0 (TID 53) in 11 ms on localhost (executor driver) (1/1)
18/01/22 15:39:56 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 53.0, whose tasks have all completed, from pool
18/01/22 15:39:56 INFO scheduler.DAGScheduler: ResultStage 53 (first at GeneralizedLinearRegression.scala:1321) finished in 0.020 s
18/01/22 15:39:56 INFO scheduler.DAGScheduler: Job 40 finished: first at GeneralizedLinearRegression.scala:1321, took 1.072145 s
18/01/22 15:39:56 INFO spark.SparkContext: Starting job: sum at GeneralizedLinearRegression.scala:1340
18/01/22 15:39:56 INFO scheduler.DAGScheduler: Got job 41 (sum at GeneralizedLinearRegression.scala:1340) with 1 output partitions
18/01/22 15:39:56 INFO scheduler.DAGScheduler: Final stage: ResultStage 54 (sum at GeneralizedLinearRegression.scala:1340)
18/01/22 15:39:56 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:56 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:56 INFO scheduler.DAGScheduler: Submitting ResultStage 54 (MapPartitionsRDD[212] at map at GeneralizedLinearRegression.scala:1337), which has no missing parents
18/01/22 15:39:56 INFO memory.MemoryStore: Block broadcast_54 stored as values in memory (estimated size 55.3 KB, free 366.1 MB)
18/01/22 15:39:56 INFO memory.MemoryStore: Block broadcast_54_piece0 stored as bytes in memory (estimated size 22.5 KB, free 366.0 MB)
18/01/22 15:39:56 INFO storage.BlockManagerInfo: Added broadcast_54_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 22.5 KB, free: 366.2 MB)
18/01/22 15:39:56 INFO spark.SparkContext: Created broadcast 54 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:56 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 54 (MapPartitionsRDD[212] at map at GeneralizedLinearRegression.scala:1337) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:56 INFO scheduler.TaskSchedulerImpl: Adding task set 54.0 with 1 tasks
18/01/22 15:39:56 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 54.0 (TID 54, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:56 INFO executor.Executor: Running task 0.0 in stage 54.0 (TID 54)
18/01/22 15:39:57 INFO r.RRunner: Times: boot = 0.209 s, init = 0.750 s, broadcast = 0.000 s, read-input = 0.000 s, compute = 0.000 s, write-output = 0.080 s, total = 1.039 s
18/01/22 15:39:57 INFO executor.Executor: Finished task 0.0 in stage 54.0 (TID 54). 1213 bytes result sent to driver
18/01/22 15:39:57 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 54.0 (TID 54) in 1037 ms on localhost (executor driver) (1/1)
18/01/22 15:39:57 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 54.0, whose tasks have all completed, from pool
18/01/22 15:39:57 INFO scheduler.DAGScheduler: ResultStage 54 (sum at GeneralizedLinearRegression.scala:1340) finished in 1.048 s
18/01/22 15:39:57 INFO scheduler.DAGScheduler: Job 41 finished: sum at GeneralizedLinearRegression.scala:1340, took 1.056593 s
18/01/22 15:39:57 INFO spark.SparkContext: Starting job: sum at GeneralizedLinearRegression.scala:1351
18/01/22 15:39:57 INFO scheduler.DAGScheduler: Got job 42 (sum at GeneralizedLinearRegression.scala:1351) with 1 output partitions
18/01/22 15:39:57 INFO scheduler.DAGScheduler: Final stage: ResultStage 55 (sum at GeneralizedLinearRegression.scala:1351)
18/01/22 15:39:57 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:57 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:57 INFO scheduler.DAGScheduler: Submitting ResultStage 55 (MapPartitionsRDD[217] at map at GeneralizedLinearRegression.scala:1348), which has no missing parents
18/01/22 15:39:57 INFO memory.MemoryStore: Block broadcast_55 stored as values in memory (estimated size 64.6 KB, free 366.0 MB)
18/01/22 15:39:57 INFO memory.MemoryStore: Block broadcast_55_piece0 stored as bytes in memory (estimated size 25.3 KB, free 365.9 MB)
18/01/22 15:39:57 INFO storage.BlockManagerInfo: Added broadcast_55_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 25.3 KB, free: 366.2 MB)
18/01/22 15:39:57 INFO spark.SparkContext: Created broadcast 55 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:57 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 55 (MapPartitionsRDD[217] at map at GeneralizedLinearRegression.scala:1348) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:57 INFO scheduler.TaskSchedulerImpl: Adding task set 55.0 with 1 tasks
18/01/22 15:39:57 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 55.0 (TID 55, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:57 INFO executor.Executor: Running task 0.0 in stage 55.0 (TID 55)
18/01/22 15:39:58 INFO r.RRunner: Times: boot = 0.216 s, init = 0.720 s, broadcast = 0.010 s, read-input = 0.000 s, compute = 0.000 s, write-output = 0.080 s, total = 1.026 s
18/01/22 15:39:58 INFO executor.Executor: Finished task 0.0 in stage 55.0 (TID 55). 1256 bytes result sent to driver
18/01/22 15:39:58 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 55.0 (TID 55) in 1029 ms on localhost (executor driver) (1/1)
18/01/22 15:39:58 INFO scheduler.DAGScheduler: ResultStage 55 (sum at GeneralizedLinearRegression.scala:1351) finished in 1.041 s
18/01/22 15:39:58 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 55.0, whose tasks have all completed, from pool
18/01/22 15:39:58 INFO scheduler.DAGScheduler: Job 42 finished: sum at GeneralizedLinearRegression.scala:1351, took 1.047123 s
18/01/22 15:39:58 INFO spark.SparkContext: Starting job: dfToCols at NativeMethodAccessorImpl.java:0
18/01/22 15:39:58 INFO scheduler.DAGScheduler: Got job 43 (dfToCols at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:39:58 INFO scheduler.DAGScheduler: Final stage: ResultStage 56 (dfToCols at NativeMethodAccessorImpl.java:0)
18/01/22 15:39:58 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:58 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:58 INFO scheduler.DAGScheduler: Submitting ResultStage 56 (MapPartitionsRDD[221] at dfToCols at NativeMethodAccessorImpl.java:0), which has no missing parents
18/01/22 15:39:58 INFO memory.MemoryStore: Block broadcast_56 stored as values in memory (estimated size 65.1 KB, free 365.9 MB)
18/01/22 15:39:58 INFO memory.MemoryStore: Block broadcast_56_piece0 stored as bytes in memory (estimated size 25.3 KB, free 365.9 MB)
18/01/22 15:39:58 INFO storage.BlockManagerInfo: Added broadcast_56_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 25.3 KB, free: 366.2 MB)
18/01/22 15:39:58 INFO spark.SparkContext: Created broadcast 56 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:58 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 56 (MapPartitionsRDD[221] at dfToCols at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:58 INFO scheduler.TaskSchedulerImpl: Adding task set 56.0 with 1 tasks
18/01/22 15:39:58 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 56.0 (TID 56, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:58 INFO executor.Executor: Running task 0.0 in stage 56.0 (TID 56)
18/01/22 15:39:59 INFO executor.Executor: Finished task 0.0 in stage 56.0 (TID 56). 1176 bytes result sent to driver
18/01/22 15:39:59 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 56.0 (TID 56) in 991 ms on localhost (executor driver) (1/1)
18/01/22 15:39:59 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 56.0, whose tasks have all completed, from pool
18/01/22 15:39:59 INFO scheduler.DAGScheduler: ResultStage 56 (dfToCols at NativeMethodAccessorImpl.java:0) finished in 0.999 s
18/01/22 15:39:59 INFO scheduler.DAGScheduler: Job 43 finished: dfToCols at NativeMethodAccessorImpl.java:0, took 1.005275 s
18/01/22 15:39:59 INFO spark.SparkContext: Starting job: dfToCols at NativeMethodAccessorImpl.java:0
18/01/22 15:39:59 INFO scheduler.DAGScheduler: Got job 44 (dfToCols at NativeMethodAccessorImpl.java:0) with 1 output partitions
18/01/22 15:39:59 INFO scheduler.DAGScheduler: Final stage: ResultStage 57 (dfToCols at NativeMethodAccessorImpl.java:0)
18/01/22 15:39:59 INFO scheduler.DAGScheduler: Parents of final stage: List()
18/01/22 15:39:59 INFO scheduler.DAGScheduler: Missing parents: List()
18/01/22 15:39:59 INFO scheduler.DAGScheduler: Submitting ResultStage 57 (MapPartitionsRDD[224] at dfToCols at NativeMethodAccessorImpl.java:0), which has no missing parents
18/01/22 15:39:59 INFO memory.MemoryStore: Block broadcast_57 stored as values in memory (estimated size 64.5 KB, free 365.8 MB)
18/01/22 15:39:59 INFO memory.MemoryStore: Block broadcast_57_piece0 stored as bytes in memory (estimated size 25.2 KB, free 365.8 MB)
18/01/22 15:39:59 INFO storage.BlockManagerInfo: Added broadcast_57_piece0 in memory on DESKTOP-BBK0H95:53090 (size: 25.2 KB, free: 366.1 MB)
18/01/22 15:39:59 INFO spark.SparkContext: Created broadcast 57 from broadcast at DAGScheduler.scala:1029
18/01/22 15:39:59 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 57 (MapPartitionsRDD[224] at dfToCols at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
18/01/22 15:39:59 INFO scheduler.TaskSchedulerImpl: Adding task set 57.0 with 1 tasks
18/01/22 15:39:59 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 57.0 (TID 57, localhost, executor driver, partition 0, PROCESS_LOCAL, 22321 bytes)
18/01/22 15:39:59 INFO executor.Executor: Running task 0.0 in stage 57.0 (TID 57)
18/01/22 15:40:00 INFO r.RRunner: Times: boot = 0.226 s, init = 0.750 s, broadcast = 0.000 s, read-input = 0.020 s, compute = 0.000 s, write-output = 0.060 s, total = 1.056 s
18/01/22 15:40:00 INFO executor.Executor: Finished task 0.0 in stage 57.0 (TID 57). 2304 bytes result sent to driver
18/01/22 15:40:00 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 57.0 (TID 57) in 1067 ms on localhost (executor driver) (1/1)
18/01/22 15:40:00 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 57.0, whose tasks have all completed, from pool
18/01/22 15:40:00 INFO scheduler.DAGScheduler: ResultStage 57 (dfToCols at NativeMethodAccessorImpl.java:0) finished in 1.081 s
18/01/22 15:40:00 INFO scheduler.DAGScheduler: Job 44 finished: dfToCols at NativeMethodAccessorImpl.java:0, took 1.087110 s
18/01/22 15:40:00 INFO server.AbstractConnector: Stopped Spark@560332bf{HTTP/1.1,[http/1.1]}{0.0.0.0:4041}
18/01/22 15:40:00 INFO ui.SparkUI: Stopped Spark web UI at http://DESKTOP-BBK0H95:4041
18/01/22 15:40:00 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/01/22 15:40:00 INFO memory.MemoryStore: MemoryStore cleared
18/01/22 15:40:00 INFO storage.BlockManager: BlockManager stopped
18/01/22 15:40:00 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
18/01/22 15:40:00 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/01/22 15:40:00 INFO spark.SparkContext: Successfully stopped SparkContext
== testthat results ===========================================================
OK: 15 SKIPPED: 0 FAILED: 0
18/01/22 15:40:05 INFO util.ShutdownHookManager: Shutdown hook called
18/01/22 15:40:05 INFO util.ShutdownHookManager: Deleting directory C:\Users\nam23\AppData\Local\Temp\spark-6e66f881-4d3e-406e-ac32-346ed40bd635
18/01/22 15:40:05 INFO util.ShutdownHookManager: Deleting directory C:\Users\nam23\AppData\Local\Temp\spark-062c7e1c-410c-4115-9bd0-a5f1ed9f3c90

C:\programs\spark-dev>