Logs for container_e54_1512469367986_4908_01_000007
Picked up JAVA_TOOL_OPTIONS: -Dfile.encoding=UTF-8
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/var/lib/hadoop/data/g/yarn/local/usercache/ebernhardson/filecache/492/__spark_libs__5186771349499828915.zip/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/flume-ng/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/12/07 04:23:20 INFO CoarseGrainedExecutorBackend: Started daemon with process name: 25933@analytics1049
17/12/07 04:23:20 INFO SignalUtils: Registered signal handler for TERM
17/12/07 04:23:20 INFO SignalUtils: Registered signal handler for HUP
17/12/07 04:23:20 INFO SignalUtils: Registered signal handler for INT
17/12/07 04:23:20 INFO SecurityManager: Changing view acls to: yarn,ebernhardson
17/12/07 04:23:20 INFO SecurityManager: Changing modify acls to: yarn,ebernhardson
17/12/07 04:23:20 INFO SecurityManager: Changing view acls groups to:
17/12/07 04:23:20 INFO SecurityManager: Changing modify acls groups to:
17/12/07 04:23:20 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, ebernhardson); groups with view permissions: Set(); users with modify permissions: Set(yarn, ebernhardson); groups with modify permissions: Set()
17/12/07 04:23:21 INFO TransportClientFactory: Successfully created connection to /10.64.53.30:40931 after 103 ms (0 ms spent in bootstraps)
17/12/07 04:23:21 WARN SparkConf: The configuration key 'spark.yarn.jar' has been deprecated as of Spark 2.0 and may be removed in the future. Please use the new key 'spark.yarn.jars' instead.
17/12/07 04:23:21 INFO SecurityManager: Changing view acls to: yarn,ebernhardson
17/12/07 04:23:21 INFO SecurityManager: Changing modify acls to: yarn,ebernhardson
17/12/07 04:23:21 INFO SecurityManager: Changing view acls groups to:
17/12/07 04:23:21 INFO SecurityManager: Changing modify acls groups to:
17/12/07 04:23:21 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, ebernhardson); groups with view permissions: Set(); users with modify permissions: Set(yarn, ebernhardson); groups with modify permissions: Set()
17/12/07 04:23:21 INFO TransportClientFactory: Successfully created connection to /10.64.53.30:40931 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/b/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-fc46b9ad-2511-4261-88a5-d2dd2aa58fd7
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/c/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-31afa154-6155-4a5c-97ce-05e7b466aacf
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/d/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-27f1a9e3-a92a-4f9b-a2d5-22c7ebf44fce
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/e/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-c6164a5a-d8e0-4691-b13a-84ff0bb75a82
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/f/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-7f119622-3e17-4d68-89ff-0bee0156f0a0
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/g/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-06648fcf-99bc-4ef3-a21c-012d74e2d514
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/h/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-451a4fe0-87ff-49d3-8872-277242971db9
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/i/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-b1851e36-6ebe-4691-849c-d1be6ba5823c
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/j/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-20bd9f76-b393-4bf7-bbef-8e3ac57c0e93
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/k/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-a3f58022-adc9-49f0-b07e-3ee72e944344
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/l/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-730c13db-1af3-465a-9ff2-7f5b250d9cf3
17/12/07 04:23:21 INFO DiskBlockManager: Created local directory at /var/lib/hadoop/data/m/yarn/local/usercache/ebernhardson/appcache/application_1512469367986_4908/blockmgr-76e61f85-b535-4ba2-ab47-7d4b64fa48c4
17/12/07 04:23:21 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
17/12/07 04:23:21 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@10.64.53.30:40931
17/12/07 04:23:21 INFO CoarseGrainedExecutorBackend: Successfully registered with driver
17/12/07 04:23:22 INFO Executor: Starting executor ID 6 on host analytics1049.eqiad.wmnet
17/12/07 04:23:22 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46181.
17/12/07 04:23:22 INFO NettyBlockTransferService: Server created on analytics1049.eqiad.wmnet:46181
17/12/07 04:23:22 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/12/07 04:23:22 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(6, analytics1049.eqiad.wmnet, 46181, None)
17/12/07 04:23:22 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(6, analytics1049.eqiad.wmnet, 46181, None)
17/12/07 04:23:22 INFO BlockManager: external shuffle service port = 7337
17/12/07 04:23:22 INFO BlockManager: Registering executor with local external shuffle service.
17/12/07 04:23:22 INFO TransportClientFactory: Successfully created connection to analytics1049.eqiad.wmnet/10.64.21.108:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:22 INFO BlockManager: Initialized BlockManager: BlockManagerId(6, analytics1049.eqiad.wmnet, 46181, None)
17/12/07 04:23:30 INFO CoarseGrainedExecutorBackend: Got assigned task 161
17/12/07 04:23:30 INFO Executor: Running task 23.0 in stage 5.0 (TID 161)
17/12/07 04:23:30 INFO MapOutputTrackerWorker: Updating epoch to 2 and clearing cache
17/12/07 04:23:30 INFO TorrentBroadcast: Started reading broadcast variable 8
17/12/07 04:23:30 INFO TransportClientFactory: Successfully created connection to analytics1052.eqiad.wmnet/10.64.5.15:37429 after 2 ms (0 ms spent in bootstraps)
17/12/07 04:23:30 INFO MemoryStore: Block broadcast_8_piece0 stored as bytes in memory (estimated size 11.8 KB, free 2004.6 MB)
17/12/07 04:23:31 INFO TorrentBroadcast: Reading broadcast variable 8 took 158 ms
17/12/07 04:23:31 INFO MemoryStore: Block broadcast_8 stored as values in memory (estimated size 26.9 KB, free 2004.6 MB)
17/12/07 04:23:31 INFO CodeGenerator: Code generated in 448.432572 ms
17/12/07 04:23:31 INFO CodeGenerator: Code generated in 12.224754 ms
17/12/07 04:23:31 INFO CodeGenerator: Code generated in 12.898548 ms
17/12/07 04:23:31 INFO CodeGenerator: Code generated in 13.493018 ms
17/12/07 04:23:31 INFO CodeGenerator: Code generated in 16.834078 ms
17/12/07 04:23:31 INFO CodeGenerator: Code generated in 12.784617 ms
17/12/07 04:23:32 INFO TorrentBroadcast: Started reading broadcast variable 7
17/12/07 04:23:32 INFO TransportClientFactory: Successfully created connection to analytics1064.eqiad.wmnet/10.64.36.104:33993 after 2 ms (0 ms spent in bootstraps)
17/12/07 04:23:32 INFO MemoryStore: Block broadcast_7_piece0 stored as bytes in memory (estimated size 28.3 KB, free 1988.3 MB)
17/12/07 04:23:32 INFO TorrentBroadcast: Reading broadcast variable 7 took 277 ms
17/12/07 04:23:32 INFO MemoryStore: Block broadcast_7 stored as values in memory (estimated size 381.7 KB, free 1987.9 MB)
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
17/12/07 04:23:33 INFO CodecPool: Got brand-new decompressor [.snappy]
17/12/07 04:23:34 INFO Executor: Finished task 23.0 in stage 5.0 (TID 161). 3586 bytes result sent to driver
17/12/07 04:23:35 INFO CoarseGrainedExecutorBackend: Got assigned task 565
17/12/07 04:23:35 INFO Executor: Running task 28.0 in stage 6.0 (TID 565)
17/12/07 04:23:35 INFO MapOutputTrackerWorker: Updating epoch to 3 and clearing cache
17/12/07 04:23:35 INFO TorrentBroadcast: Started reading broadcast variable 9
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1043.eqiad.wmnet/10.64.53.23:37507 after 2 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 10.9 KB, free 2004.2 MB)
17/12/07 04:23:35 INFO TorrentBroadcast: Reading broadcast variable 9 took 37 ms
17/12/07 04:23:35 INFO MemoryStore: Block broadcast_9 stored as values in memory (estimated size 23.0 KB, free 2004.1 MB)
17/12/07 04:23:35 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle 2, fetching them
17/12/07 04:23:35 INFO MapOutputTrackerWorker: Doing the fetch; tracker endpoint = NettyRpcEndpointRef(spark://MapOutputTracker@10.64.53.30:40931)
17/12/07 04:23:35 INFO MapOutputTrackerWorker: Got the output locations
17/12/07 04:23:35 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1032.eqiad.wmnet/10.64.36.132:7337 after 2 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1031.eqiad.wmnet/10.64.36.131:7337 after 4 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1030.eqiad.wmnet/10.64.36.130:7337 after 3 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1064.eqiad.wmnet/10.64.36.104:7337 after 19 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1029.eqiad.wmnet/10.64.36.129:7337 after 14 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1046.eqiad.wmnet/10.64.21.105:7337 after 13 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1055.eqiad.wmnet/10.64.5.18:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1050.eqiad.wmnet/10.64.21.111:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1040.eqiad.wmnet/10.64.53.19:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1051.eqiad.wmnet/10.64.21.112:7337 after 2 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1039.eqiad.wmnet/10.64.53.18:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1066.eqiad.wmnet/10.64.36.106:7337 after 2 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1033.eqiad.wmnet/10.64.36.133:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1062.eqiad.wmnet/10.64.21.114:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1038.eqiad.wmnet/10.64.53.17:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1054.eqiad.wmnet/10.64.5.17:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1059.eqiad.wmnet/10.64.5.22:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1047.eqiad.wmnet/10.64.21.106:7337 after 0 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1044.eqiad.wmnet/10.64.53.24:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1060.eqiad.wmnet/10.64.5.23:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1048.eqiad.wmnet/10.64.21.107:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1036.eqiad.wmnet/10.64.53.15:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1028.eqiad.wmnet/10.64.36.128:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1049.eqiad.wmnet/10.64.21.108:7337 after 0 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1052.eqiad.wmnet/10.64.5.15:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1067.eqiad.wmnet/10.64.53.27:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1061.eqiad.wmnet/10.64.21.113:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1037.eqiad.wmnet/10.64.53.16:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1068.eqiad.wmnet/10.64.53.28:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1065.eqiad.wmnet/10.64.36.105:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1045.eqiad.wmnet/10.64.53.25:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1043.eqiad.wmnet/10.64.53.23:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1063.eqiad.wmnet/10.64.21.115:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1057.eqiad.wmnet/10.64.5.20:7337 after 2 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1056.eqiad.wmnet/10.64.5.19:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1042.eqiad.wmnet/10.64.53.22:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1053.eqiad.wmnet/10.64.5.16:7337 after 4 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1034.eqiad.wmnet/10.64.36.134:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1058.eqiad.wmnet/10.64.5.21:7337 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1041.eqiad.wmnet/10.64.53.20:7337 after 2 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1035.eqiad.wmnet/10.64.53.14:7337 after 0 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO TransportClientFactory: Successfully created connection to analytics1069.eqiad.wmnet/10.64.53.29:7337 after 0 ms (0 ms spent in bootstraps)
17/12/07 04:23:35 INFO ShuffleBlockFetcherIterator: Started 58 remote fetches in 230 ms
17/12/07 04:23:35 INFO CodeGenerator: Code generated in 48.723522 ms
17/12/07 04:23:36 INFO Executor: Finished task 28.0 in stage 6.0 (TID 565). 17275 bytes result sent to driver
17/12/07 04:23:36 INFO CoarseGrainedExecutorBackend: Got assigned task 632
17/12/07 04:23:36 INFO Executor: Running task 95.0 in stage 6.0 (TID 632)
17/12/07 04:23:36 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:23:36 INFO ShuffleBlockFetcherIterator: Started 58 remote fetches in 51 ms
17/12/07 04:23:36 INFO Executor: Finished task 95.0 in stage 6.0 (TID 632). 17313 bytes result sent to driver
17/12/07 04:23:36 INFO CoarseGrainedExecutorBackend: Got assigned task 710
17/12/07 04:23:36 INFO Executor: Running task 173.0 in stage 6.0 (TID 710)
17/12/07 04:23:36 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:23:36 INFO ShuffleBlockFetcherIterator: Started 58 remote fetches in 55 ms
17/12/07 04:23:36 INFO Executor: Finished task 173.0 in stage 6.0 (TID 710). 17114 bytes result sent to driver
17/12/07 04:23:45 INFO CoarseGrainedExecutorBackend: Got assigned task 799
17/12/07 04:23:45 INFO Executor: Running task 61.0 in stage 8.0 (TID 799)
17/12/07 04:23:45 INFO TorrentBroadcast: Started reading broadcast variable 14
17/12/07 04:23:45 INFO TransportClientFactory: Successfully created connection to analytics1059.eqiad.wmnet/10.64.5.22:45183 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:46 INFO MemoryStore: Block broadcast_14_piece0 stored as bytes in memory (estimated size 9.4 KB, free 2004.2 MB)
17/12/07 04:23:46 INFO TorrentBroadcast: Reading broadcast variable 14 took 59 ms
17/12/07 04:23:46 INFO MemoryStore: Block broadcast_14 stored as values in memory (estimated size 18.1 KB, free 2004.2 MB)
17/12/07 04:23:46 INFO CodeGenerator: Code generated in 15.857671 ms
17/12/07 04:23:46 INFO CodeGenerator: Code generated in 22.301075 ms
17/12/07 04:23:46 INFO PythonRunner: Times: total = 710, boot = 682, init = 24, finish = 4
17/12/07 04:23:47 INFO Executor: Finished task 61.0 in stage 8.0 (TID 799). 3141 bytes result sent to driver
17/12/07 04:23:47 INFO CoarseGrainedExecutorBackend: Got assigned task 856
17/12/07 04:23:47 INFO Executor: Running task 118.0 in stage 8.0 (TID 856)
17/12/07 04:23:47 INFO PythonRunner: Times: total = 50, boot = -214, init = 260, finish = 4
17/12/07 04:23:47 INFO Executor: Finished task 118.0 in stage 8.0 (TID 856). 2553 bytes result sent to driver
17/12/07 04:23:47 INFO CoarseGrainedExecutorBackend: Got assigned task 938
17/12/07 04:23:47 INFO Executor: Running task 200.0 in stage 8.0 (TID 938)
17/12/07 04:23:47 INFO PythonRunner: Times: total = 49, boot = -168, init = 213, finish = 4
17/12/07 04:23:47 INFO Executor: Finished task 200.0 in stage 8.0 (TID 938). 2553 bytes result sent to driver
17/12/07 04:23:47 INFO CoarseGrainedExecutorBackend: Got assigned task 1007
17/12/07 04:23:47 INFO Executor: Running task 269.0 in stage 8.0 (TID 1007)
17/12/07 04:23:47 INFO PythonRunner: Times: total = 47, boot = -106, init = 149, finish = 4
17/12/07 04:23:47 INFO Executor: Finished task 269.0 in stage 8.0 (TID 1007). 2553 bytes result sent to driver
17/12/07 04:23:47 INFO CoarseGrainedExecutorBackend: Got assigned task 1082
17/12/07 04:23:47 INFO Executor: Running task 344.0 in stage 8.0 (TID 1082)
17/12/07 04:23:47 INFO PythonRunner: Times: total = 50, boot = -105, init = 151, finish = 4
17/12/07 04:23:47 INFO Executor: Finished task 344.0 in stage 8.0 (TID 1082). 2553 bytes result sent to driver
17/12/07 04:23:47 INFO CoarseGrainedExecutorBackend: Got assigned task 1141
17/12/07 04:23:47 INFO Executor: Running task 23.0 in stage 9.0 (TID 1141)
17/12/07 04:23:47 INFO TorrentBroadcast: Started reading broadcast variable 15
17/12/07 04:23:47 INFO TransportClientFactory: Successfully created connection to analytics1035.eqiad.wmnet/10.64.53.14:38229 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:47 INFO MemoryStore: Block broadcast_15_piece0 stored as bytes in memory (estimated size 7.6 KB, free 2004.2 MB)
17/12/07 04:23:47 INFO TorrentBroadcast: Reading broadcast variable 15 took 81 ms
17/12/07 04:23:47 INFO MemoryStore: Block broadcast_15 stored as values in memory (estimated size 18.1 KB, free 2004.1 MB)
17/12/07 04:23:47 INFO CodeGenerator: Code generated in 53.440243 ms
17/12/07 04:23:47 INFO CodeGenerator: Code generated in 22.074297 ms
17/12/07 04:23:47 INFO TorrentBroadcast: Started reading broadcast variable 13
17/12/07 04:23:47 INFO TransportClientFactory: Successfully created connection to analytics1066.eqiad.wmnet/10.64.36.106:41389 after 2 ms (0 ms spent in bootstraps)
17/12/07 04:23:47 INFO MemoryStore: Block broadcast_13_piece0 stored as bytes in memory (estimated size 29.3 KB, free 2004.1 MB)
17/12/07 04:23:47 INFO TorrentBroadcast: Reading broadcast variable 13 took 54 ms
17/12/07 04:23:48 INFO MemoryStore: Block broadcast_13 stored as values in memory (estimated size 381.7 KB, free 2003.7 MB)
17/12/07 04:23:48 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:48 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:23:48 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:48 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:23:50 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:50 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:23:50 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:50 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:23:52 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:52 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:23:52 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:52 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:23:53 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:53 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:23:53 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:53 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:23:53 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:53 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:23:53 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:53 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:23:53 INFO InternalParquetRecordReader: RecordReader initialized will read a total of 0 records.
17/12/07 04:23:53 INFO Executor: Finished task 310.0 in stage 9.0 (TID 1476). 1733 bytes result sent to driver
17/12/07 04:23:53 INFO CoarseGrainedExecutorBackend: Got assigned task 1487
17/12/07 04:23:53 INFO Executor: Running task 43.0 in stage 12.0 (TID 1487)
17/12/07 04:23:53 INFO TorrentBroadcast: Started reading broadcast variable 16
17/12/07 04:23:53 INFO TransportClientFactory: Successfully created connection to analytics1046.eqiad.wmnet/10.64.21.105:46169 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:53 INFO MemoryStore: Block broadcast_16_piece0 stored as bytes in memory (estimated size 7.6 KB, free 2003.7 MB)
17/12/07 04:23:53 INFO TorrentBroadcast: Reading broadcast variable 16 took 11 ms
17/12/07 04:23:53 INFO MemoryStore: Block broadcast_16 stored as values in memory (estimated size 18.1 KB, free 2003.7 MB)
17/12/07 04:23:53 INFO TorrentBroadcast: Started reading broadcast variable 11
17/12/07 04:23:53 INFO TransportClientFactory: Successfully created connection to analytics1030.eqiad.wmnet/10.64.36.130:41203 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:53 INFO MemoryStore: Block broadcast_11_piece0 stored as bytes in memory (estimated size 29.3 KB, free 2003.7 MB)
17/12/07 04:23:53 INFO TorrentBroadcast: Reading broadcast variable 11 took 12 ms
17/12/07 04:23:53 INFO MemoryStore: Block broadcast_11 stored as values in memory (estimated size 381.7 KB, free 2003.3 MB)
17/12/07 04:23:53 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:53 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:23:53 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:53 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:23:55 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:55 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:23:55 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:55 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:23:56 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:56 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:23:56 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:23:56 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:23:56 INFO InternalParquetRecordReader: RecordReader initialized will read a total of 452514 records.
17/12/07 04:23:56 INFO InternalParquetRecordReader: at row 0. reading next block
17/12/07 04:23:56 INFO InternalParquetRecordReader: block read in memory in 56 ms. row count = 452514
17/12/07 04:23:57 INFO Executor: Finished task 143.0 in stage 12.0 (TID 1651). 2083 bytes result sent to driver
17/12/07 04:23:57 INFO CoarseGrainedExecutorBackend: Got assigned task 1712
17/12/07 04:23:57 INFO Executor: Running task 60.0 in stage 10.0 (TID 1712)
17/12/07 04:23:57 INFO MapOutputTrackerWorker: Updating epoch to 5 and clearing cache
17/12/07 04:23:57 INFO TorrentBroadcast: Started reading broadcast variable 20
17/12/07 04:23:57 INFO TransportClientFactory: Successfully created connection to analytics1049.eqiad.wmnet/10.64.21.108:33693 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:23:57 INFO MemoryStore: Block broadcast_20_piece0 stored as bytes in memory (estimated size 18.4 KB, free 2003.3 MB)
17/12/07 04:23:57 INFO TorrentBroadcast: Reading broadcast variable 20 took 26 ms
17/12/07 04:23:57 INFO MemoryStore: Block broadcast_20 stored as values in memory (estimated size 41.3 KB, free 2003.3 MB)
17/12/07 04:23:58 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle 9, fetching them
17/12/07 04:23:58 INFO MapOutputTrackerWorker: Doing the fetch; tracker endpoint = NettyRpcEndpointRef(spark://MapOutputTracker@10.64.53.30:40931)
17/12/07 04:23:58 INFO MapOutputTrackerWorker: Got the output locations
17/12/07 04:23:58 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:23:58 INFO ShuffleBlockFetcherIterator: Started 64 remote fetches in 24 ms
17/12/07 04:23:58 INFO CodeGenerator: Code generated in 21.091631 ms
17/12/07 04:23:58 INFO CodeGenerator: Code generated in 14.87779 ms
17/12/07 04:23:58 INFO CodeGenerator: Code generated in 18.717434 ms
17/12/07 04:23:58 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle 10, fetching them
17/12/07 04:23:58 INFO MapOutputTrackerWorker: Doing the fetch; tracker endpoint = NettyRpcEndpointRef(spark://MapOutputTracker@10.64.53.30:40931)
17/12/07 04:23:58 INFO MapOutputTrackerWorker: Got the output locations
17/12/07 04:23:58 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks out of 378 blocks
17/12/07 04:23:58 INFO ShuffleBlockFetcherIterator: Started 2 remote fetches in 2 ms
17/12/07 04:23:58 INFO CodeGenerator: Code generated in 20.455553 ms
17/12/07 04:23:58 INFO CodeGenerator: Code generated in 59.217063 ms
17/12/07 04:23:58 INFO CodeGenerator: Code generated in 17.806493 ms
17/12/07 04:23:59 INFO Executor: Finished task 60.0 in stage 10.0 (TID 1712). 4785 bytes result sent to driver
17/12/07 04:23:59 INFO CoarseGrainedExecutorBackend: Got assigned task 1806
17/12/07 04:23:59 INFO Executor: Running task 142.0 in stage 10.0 (TID 1806)
17/12/07 04:23:59 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:23:59 INFO ShuffleBlockFetcherIterator: Started 64 remote fetches in 29 ms
17/12/07 04:23:59 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 378 blocks
17/12/07 04:23:59 INFO ShuffleBlockFetcherIterator: Started 2 remote fetches in 1 ms
17/12/07 04:24:00 INFO Executor: Finished task 142.0 in stage 10.0 (TID 1806). 4197 bytes result sent to driver
17/12/07 04:24:00 INFO CoarseGrainedExecutorBackend: Got assigned task 1863
17/12/07 04:24:00 INFO Executor: Running task 181.0 in stage 10.0 (TID 1863)
17/12/07 04:24:00 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:24:00 INFO ShuffleBlockFetcherIterator: Started 64 remote fetches in 33 ms
17/12/07 04:24:00 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 378 blocks
17/12/07 04:24:00 INFO ShuffleBlockFetcherIterator: Started 2 remote fetches in 1 ms
17/12/07 04:24:00 INFO Executor: Finished task 181.0 in stage 10.0 (TID 1863). 4284 bytes result sent to driver
17/12/07 04:24:00 INFO CoarseGrainedExecutorBackend: Got assigned task 2033
17/12/07 04:24:00 INFO Executor: Running task 203.0 in stage 12.0 (TID 2033)
17/12/07 04:24:00 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:00 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:00 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:00 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:01 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:01 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:01 INFO InternalParquetRecordReader: RecordReader initialized will read a total of 0 records.
17/12/07 04:24:01 INFO Executor: Finished task 369.0 in stage 12.0 (TID 2101). 1733 bytes result sent to driver
17/12/07 04:24:01 INFO CoarseGrainedExecutorBackend: Got assigned task 2108
17/12/07 04:24:01 INFO Executor: Running task 63.0 in stage 13.0 (TID 2108)
17/12/07 04:24:01 INFO TorrentBroadcast: Started reading broadcast variable 17
17/12/07 04:24:01 INFO TransportClientFactory: Successfully created connection to analytics1048.eqiad.wmnet/10.64.21.107:36753 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:24:01 INFO MemoryStore: Block broadcast_17_piece0 stored as bytes in memory (estimated size 9.4 KB, free 2003.3 MB)
17/12/07 04:24:01 INFO TorrentBroadcast: Reading broadcast variable 17 took 27 ms
17/12/07 04:24:01 INFO MemoryStore: Block broadcast_17 stored as values in memory (estimated size 18.1 KB, free 2003.2 MB)
17/12/07 04:24:01 INFO CodeGenerator: Code generated in 21.929687 ms
17/12/07 04:24:01 INFO PythonRunner: Times: total = 48, boot = -13673, init = 13717, finish = 4
17/12/07 04:24:01 INFO Executor: Finished task 63.0 in stage 13.0 (TID 2108). 3068 bytes result sent to driver
17/12/07 04:24:01 INFO CoarseGrainedExecutorBackend: Got assigned task 2150
17/12/07 04:24:01 INFO Executor: Running task 96.0 in stage 13.0 (TID 2150)
17/12/07 04:24:01 INFO PythonRunner: Times: total = 47, boot = -61, init = 104, finish = 4
17/12/07 04:24:01 INFO Executor: Finished task 96.0 in stage 13.0 (TID 2150). 2553 bytes result sent to driver
17/12/07 04:24:01 INFO CoarseGrainedExecutorBackend: Got assigned task 2175
17/12/07 04:24:01 INFO Executor: Running task 118.0 in stage 13.0 (TID 2175)
17/12/07 04:24:01 INFO PythonRunner: Times: total = 49, boot = -41, init = 86, finish = 4
17/12/07 04:24:01 INFO Executor: Finished task 118.0 in stage 13.0 (TID 2175). 2553 bytes result sent to driver
17/12/07 04:24:01 INFO CoarseGrainedExecutorBackend: Got assigned task 2202
17/12/07 04:24:01 INFO Executor: Running task 139.0 in stage 13.0 (TID 2202)
17/12/07 04:24:01 INFO PythonRunner: Times: total = 49, boot = -33, init = 78, finish = 4
17/12/07 04:24:01 INFO Executor: Finished task 139.0 in stage 13.0 (TID 2202). 2553 bytes result sent to driver
17/12/07 04:24:01 INFO CoarseGrainedExecutorBackend: Got assigned task 2228
17/12/07 04:24:01 INFO Executor: Running task 160.0 in stage 13.0 (TID 2228)
17/12/07 04:24:01 INFO PythonRunner: Times: total = 47, boot = -31, init = 74, finish = 4
17/12/07 04:24:01 INFO Executor: Finished task 160.0 in stage 13.0 (TID 2228). 2553 bytes result sent to driver
17/12/07 04:24:01 INFO CoarseGrainedExecutorBackend: Got assigned task 2263
17/12/07 04:24:01 INFO Executor: Running task 184.0 in stage 13.0 (TID 2263)
17/12/07 04:24:01 INFO PythonRunner: Times: total = 48, boot = -27, init = 71, finish = 4
17/12/07 04:24:01 INFO Executor: Finished task 184.0 in stage 13.0 (TID 2263). 2553 bytes result sent to driver
17/12/07 04:24:01 INFO CoarseGrainedExecutorBackend: Got assigned task 2303
17/12/07 04:24:01 INFO Executor: Running task 212.0 in stage 13.0 (TID 2303)
17/12/07 04:24:01 INFO PythonRunner: Times: total = 48, boot = -27, init = 69, finish = 6
17/12/07 04:24:01 INFO Executor: Finished task 212.0 in stage 13.0 (TID 2303). 2553 bytes result sent to driver
17/12/07 04:24:01 INFO CoarseGrainedExecutorBackend: Got assigned task 2350
17/12/07 04:24:01 INFO Executor: Running task 253.0 in stage 13.0 (TID 2350)
17/12/07 04:24:02 INFO PythonRunner: Times: total = 48, boot = -42, init = 86, finish = 4
17/12/07 04:24:02 INFO Executor: Finished task 253.0 in stage 13.0 (TID 2350). 2553 bytes result sent to driver
17/12/07 04:24:02 INFO CoarseGrainedExecutorBackend: Got assigned task 2396
17/12/07 04:24:02 INFO Executor: Running task 292.0 in stage 13.0 (TID 2396)
17/12/07 04:24:02 INFO PythonRunner: Times: total = 47, boot = -22, init = 66, finish = 3
17/12/07 04:24:02 INFO Executor: Finished task 292.0 in stage 13.0 (TID 2396). 2553 bytes result sent to driver
17/12/07 04:24:02 INFO CoarseGrainedExecutorBackend: Got assigned task 2439
17/12/07 04:24:02 INFO Executor: Running task 330.0 in stage 13.0 (TID 2439)
17/12/07 04:24:02 INFO PythonRunner: Times: total = 49, boot = -25, init = 71, finish = 3
17/12/07 04:24:02 INFO Executor: Finished task 330.0 in stage 13.0 (TID 2439). 2553 bytes result sent to driver
17/12/07 04:24:02 INFO CoarseGrainedExecutorBackend: Got assigned task 2488
17/12/07 04:24:02 INFO Executor: Running task 375.0 in stage 13.0 (TID 2488)
17/12/07 04:24:02 INFO PythonRunner: Times: total = 45, boot = -28, init = 70, finish = 3
17/12/07 04:24:02 INFO Executor: Finished task 375.0 in stage 13.0 (TID 2488). 2553 bytes result sent to driver
17/12/07 04:24:02 INFO CoarseGrainedExecutorBackend: Got assigned task 2532
17/12/07 04:24:02 INFO Executor: Running task 0.0 in stage 11.0 (TID 2532)
17/12/07 04:24:02 INFO MapOutputTrackerWorker: Updating epoch to 6 and clearing cache
17/12/07 04:24:02 INFO TorrentBroadcast: Started reading broadcast variable 21
17/12/07 04:24:02 INFO TransportClientFactory: Successfully created connection to /10.64.53.30:35539 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:24:02 INFO MemoryStore: Block broadcast_21_piece0 stored as bytes in memory (estimated size 17.3 KB, free 2003.3 MB)
17/12/07 04:24:02 INFO TorrentBroadcast: Reading broadcast variable 21 took 9 ms
17/12/07 04:24:02 INFO MemoryStore: Block broadcast_21 stored as values in memory (estimated size 34.3 KB, free 2003.2 MB)
17/12/07 04:24:02 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle 11, fetching them
17/12/07 04:24:02 INFO MapOutputTrackerWorker: Doing the fetch; tracker endpoint = NettyRpcEndpointRef(spark://MapOutputTracker@10.64.53.30:40931)
17/12/07 04:24:02 INFO MapOutputTrackerWorker: Got the output locations
17/12/07 04:24:02 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 200 blocks
17/12/07 04:24:02 INFO ShuffleBlockFetcherIterator: Started 7 remote fetches in 7 ms
17/12/07 04:24:02 INFO CodeGenerator: Code generated in 11.16376 ms
17/12/07 04:24:02 INFO CodeGenerator: Code generated in 18.003127 ms
17/12/07 04:24:13 INFO MemoryStore: Will not store rdd_73_0
17/12/07 04:24:13 WARN MemoryStore: Not enough space to cache rdd_73_0 in memory! (computed 240.2 MB so far)
17/12/07 04:24:13 INFO MemoryStore: Memory use = 1396.5 KB (blocks) + 230.1 MB (scratch space shared across 1 tasks(s)) = 231.5 MB. Storage limit = 308.6 MB.
17/12/07 04:24:13 WARN BlockManager: Persisting block rdd_73_0 to disk instead.
17/12/07 04:24:23 INFO MemoryStore: Block rdd_73_0 stored as values in memory (estimated size 949.9 MB, free 1053.4 MB)
17/12/07 04:24:23 INFO CodeGenerator: Code generated in 5.893662 ms
17/12/07 04:24:23 INFO CodeGenerator: Code generated in 21.329799 ms
17/12/07 04:24:23 INFO CodeGenerator: Code generated in 8.402581 ms
17/12/07 04:24:25 INFO MemoryStore: Block taskresult_2532 stored as bytes in memory (estimated size 4.8 MB, free 1048.6 MB)
17/12/07 04:24:25 INFO Executor: Finished task 0.0 in stage 11.0 (TID 2532). 4994683 bytes result sent via BlockManager)
17/12/07 04:24:31 INFO CoarseGrainedExecutorBackend: Got assigned task 3740
17/12/07 04:24:31 INFO Executor: Running task 59.0 in stage 20.0 (TID 3740)
17/12/07 04:24:31 INFO MapOutputTrackerWorker: Updating epoch to 12 and clearing cache
17/12/07 04:24:31 INFO TorrentBroadcast: Started reading broadcast variable 27
17/12/07 04:24:31 INFO TransportClientFactory: Successfully created connection to analytics1039.eqiad.wmnet/10.64.53.18:34883 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:24:31 INFO MemoryStore: Block broadcast_27_piece0 stored as bytes in memory (estimated size 9.3 KB, free 1053.4 MB)
17/12/07 04:24:31 INFO TorrentBroadcast: Reading broadcast variable 27 took 11 ms
17/12/07 04:24:31 INFO MemoryStore: Block broadcast_27 stored as values in memory (estimated size 18.0 KB, free 1053.4 MB)
17/12/07 04:24:31 INFO CodeGenerator: Code generated in 22.48162 ms
17/12/07 04:24:31 INFO PythonRunner: Times: total = 49, boot = -29033, init = 29078, finish = 4
17/12/07 04:24:31 INFO Executor: Finished task 59.0 in stage 20.0 (TID 3740). 3068 bytes result sent to driver
17/12/07 04:24:31 INFO CoarseGrainedExecutorBackend: Got assigned task 3771
17/12/07 04:24:31 INFO Executor: Running task 90.0 in stage 20.0 (TID 3771)
17/12/07 04:24:31 INFO PythonRunner: Times: total = 48, boot = -45, init = 89, finish = 4
17/12/07 04:24:31 INFO Executor: Finished task 90.0 in stage 20.0 (TID 3771). 2553 bytes result sent to driver
17/12/07 04:24:31 INFO CoarseGrainedExecutorBackend: Got assigned task 3829
17/12/07 04:24:31 INFO Executor: Running task 148.0 in stage 20.0 (TID 3829)
17/12/07 04:24:31 INFO PythonRunner: Times: total = 45, boot = -21, init = 63, finish = 3
17/12/07 04:24:31 INFO Executor: Finished task 148.0 in stage 20.0 (TID 3829). 2553 bytes result sent to driver
17/12/07 04:24:31 INFO CoarseGrainedExecutorBackend: Got assigned task 3885
17/12/07 04:24:31 INFO Executor: Running task 204.0 in stage 20.0 (TID 3885)
17/12/07 04:24:31 INFO PythonRunner: Times: total = 50, boot = -21, init = 67, finish = 4
17/12/07 04:24:31 INFO Executor: Finished task 204.0 in stage 20.0 (TID 3885). 2553 bytes result sent to driver
17/12/07 04:24:31 INFO CoarseGrainedExecutorBackend: Got assigned task 3946
17/12/07 04:24:31 INFO Executor: Running task 265.0 in stage 20.0 (TID 3946)
17/12/07 04:24:31 INFO PythonRunner: Times: total = 48, boot = -21, init = 65, finish = 4
17/12/07 04:24:31 INFO Executor: Finished task 265.0 in stage 20.0 (TID 3946). 2553 bytes result sent to driver
17/12/07 04:24:31 INFO CoarseGrainedExecutorBackend: Got assigned task 4003
17/12/07 04:24:31 INFO Executor: Running task 322.0 in stage 20.0 (TID 4003)
17/12/07 04:24:31 INFO PythonRunner: Times: total = 48, boot = -20, init = 64, finish = 4
17/12/07 04:24:31 INFO Executor: Finished task 322.0 in stage 20.0 (TID 4003). 2553 bytes result sent to driver
17/12/07 04:24:31 INFO CoarseGrainedExecutorBackend: Got assigned task 4063
17/12/07 04:24:31 INFO Executor: Running task 8.0 in stage 21.0 (TID 4063)
17/12/07 04:24:31 INFO TorrentBroadcast: Started reading broadcast variable 28
17/12/07 04:24:31 INFO TransportClientFactory: Successfully created connection to analytics1055.eqiad.wmnet/10.64.5.18:32823 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:24:31 INFO MemoryStore: Block broadcast_28_piece0 stored as bytes in memory (estimated size 7.6 KB, free 1053.3 MB)
17/12/07 04:24:31 INFO TorrentBroadcast: Reading broadcast variable 28 took 9 ms
17/12/07 04:24:31 INFO MemoryStore: Block broadcast_28 stored as values in memory (estimated size 18.1 KB, free 1053.3 MB)
17/12/07 04:24:31 INFO TorrentBroadcast: Started reading broadcast variable 26
17/12/07 04:24:31 INFO MemoryStore: Block broadcast_26_piece0 stored as bytes in memory (estimated size 29.3 KB, free 1053.3 MB)
17/12/07 04:24:31 INFO TorrentBroadcast: Reading broadcast variable 26 took 7 ms
17/12/07 04:24:31 INFO MemoryStore: Block broadcast_26 stored as values in memory (estimated size 381.7 KB, free 1052.9 MB)
17/12/07 04:24:31 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:31 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:31 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:31 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:33 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:33 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:33 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:33 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:34 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:34 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:34 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:34 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:35 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:35 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:35 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:35 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:35 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:35 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:35 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:35 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:35 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:35 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:35 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:35 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:35 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:35 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:35 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:35 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:35 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:35 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:35 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:35 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:35 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:35 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:36 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:36 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:36 INFO InternalParquetRecordReader: RecordReader initialized will read a total of 0 records.
17/12/07 04:24:36 INFO Executor: Finished task 369.0 in stage 21.0 (TID 4377). 1733 bytes result sent to driver
17/12/07 04:24:37 INFO CoarseGrainedExecutorBackend: Got assigned task 4496
17/12/07 04:24:38 INFO Executor: Running task 31.0 in stage 22.0 (TID 4496)
17/12/07 04:24:38 INFO MapOutputTrackerWorker: Updating epoch to 14 and clearing cache
17/12/07 04:24:38 INFO TorrentBroadcast: Started reading broadcast variable 30
17/12/07 04:24:38 INFO MemoryStore: Block broadcast_30_piece0 stored as bytes in memory (estimated size 18.3 KB, free 1052.9 MB)
17/12/07 04:24:38 INFO TorrentBroadcast: Reading broadcast variable 30 took 13 ms
17/12/07 04:24:38 INFO MemoryStore: Block broadcast_30 stored as values in memory (estimated size 41.3 KB, free 1052.9 MB)
17/12/07 04:24:38 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle 12, fetching them
17/12/07 04:24:38 INFO MapOutputTrackerWorker: Doing the fetch; tracker endpoint = NettyRpcEndpointRef(spark://MapOutputTracker@10.64.53.30:40931)
17/12/07 04:24:38 INFO MapOutputTrackerWorker: Got the output locations
17/12/07 04:24:38 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:24:38 INFO ShuffleBlockFetcherIterator: Started 60 remote fetches in 25 ms
17/12/07 04:24:38 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle 13, fetching them
17/12/07 04:24:38 INFO MapOutputTrackerWorker: Doing the fetch; tracker endpoint = NettyRpcEndpointRef(spark://MapOutputTracker@10.64.53.30:40931)
17/12/07 04:24:38 INFO MapOutputTrackerWorker: Got the output locations
17/12/07 04:24:38 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 378 blocks
17/12/07 04:24:38 INFO ShuffleBlockFetcherIterator: Started 2 remote fetches in 1 ms
17/12/07 04:24:38 INFO Executor: Finished task 31.0 in stage 22.0 (TID 4496). 4712 bytes result sent to driver
17/12/07 04:24:38 INFO CoarseGrainedExecutorBackend: Got assigned task 4555
17/12/07 04:24:38 INFO Executor: Running task 108.0 in stage 22.0 (TID 4555)
17/12/07 04:24:38 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:24:38 INFO ShuffleBlockFetcherIterator: Started 60 remote fetches in 27 ms
17/12/07 04:24:38 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 378 blocks
17/12/07 04:24:38 INFO ShuffleBlockFetcherIterator: Started 2 remote fetches in 1 ms
17/12/07 04:24:39 INFO Executor: Finished task 108.0 in stage 22.0 (TID 4555). 4284 bytes result sent to driver
17/12/07 04:24:39 INFO CoarseGrainedExecutorBackend: Got assigned task 4632
17/12/07 04:24:39 INFO Executor: Running task 184.0 in stage 22.0 (TID 4632)
17/12/07 04:24:39 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:24:39 INFO ShuffleBlockFetcherIterator: Started 60 remote fetches in 15 ms
17/12/07 04:24:39 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 378 blocks
17/12/07 04:24:39 INFO ShuffleBlockFetcherIterator: Started 3 remote fetches in 1 ms
17/12/07 04:24:39 INFO Executor: Finished task 184.0 in stage 22.0 (TID 4632). 4197 bytes result sent to driver
17/12/07 04:24:39 INFO CoarseGrainedExecutorBackend: Got assigned task 4733
17/12/07 04:24:39 INFO Executor: Running task 76.0 in stage 24.0 (TID 4733)
17/12/07 04:24:39 INFO TorrentBroadcast: Started reading broadcast variable 31
17/12/07 04:24:39 INFO TransportClientFactory: Successfully created connection to analytics1053.eqiad.wmnet/10.64.5.16:34421 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:24:39 INFO MemoryStore: Block broadcast_31_piece0 stored as bytes in memory (estimated size 9.3 KB, free 1052.9 MB)
17/12/07 04:24:39 INFO TorrentBroadcast: Reading broadcast variable 31 took 36 ms
17/12/07 04:24:39 INFO MemoryStore: Block broadcast_31 stored as values in memory (estimated size 18.0 KB, free 1052.9 MB)
17/12/07 04:24:39 INFO CodeGenerator: Code generated in 20.977209 ms
17/12/07 04:24:40 INFO PythonRunner: Times: total = 51, boot = -8097, init = 8144, finish = 4
17/12/07 04:24:40 INFO Executor: Finished task 76.0 in stage 24.0 (TID 4733). 3068 bytes result sent to driver
17/12/07 04:24:40 INFO CoarseGrainedExecutorBackend: Got assigned task 4804
17/12/07 04:24:40 INFO Executor: Running task 145.0 in stage 24.0 (TID 4804)
17/12/07 04:24:40 INFO PythonRunner: Times: total = 48, boot = -40, init = 84, finish = 4
17/12/07 04:24:40 INFO Executor: Finished task 145.0 in stage 24.0 (TID 4804). 2553 bytes result sent to driver
17/12/07 04:24:40 INFO CoarseGrainedExecutorBackend: Got assigned task 4855
17/12/07 04:24:40 INFO Executor: Running task 196.0 in stage 24.0 (TID 4855)
17/12/07 04:24:40 INFO PythonRunner: Times: total = 49, boot = -21, init = 66, finish = 4
17/12/07 04:24:40 INFO Executor: Finished task 196.0 in stage 24.0 (TID 4855). 2553 bytes result sent to driver
17/12/07 04:24:40 INFO CoarseGrainedExecutorBackend: Got assigned task 4912
17/12/07 04:24:40 INFO Executor: Running task 253.0 in stage 24.0 (TID 4912)
17/12/07 04:24:40 INFO PythonRunner: Times: total = 49, boot = -18, init = 63, finish = 4
17/12/07 04:24:40 INFO Executor: Finished task 253.0 in stage 24.0 (TID 4912). 2553 bytes result sent to driver
17/12/07 04:24:40 INFO CoarseGrainedExecutorBackend: Got assigned task 4971
17/12/07 04:24:40 INFO Executor: Running task 312.0 in stage 24.0 (TID 4971)
17/12/07 04:24:40 INFO PythonRunner: Times: total = 46, boot = -63, init = 105, finish = 4
17/12/07 04:24:40 INFO Executor: Finished task 312.0 in stage 24.0 (TID 4971). 2553 bytes result sent to driver
17/12/07 04:24:40 INFO CoarseGrainedExecutorBackend: Got assigned task 5029
17/12/07 04:24:40 INFO Executor: Running task 370.0 in stage 24.0 (TID 5029)
17/12/07 04:24:40 INFO PythonRunner: Times: total = 49, boot = -72, init = 117, finish = 4
17/12/07 04:24:40 INFO Executor: Finished task 370.0 in stage 24.0 (TID 5029). 2553 bytes result sent to driver
17/12/07 04:24:40 INFO CoarseGrainedExecutorBackend: Got assigned task 5101
17/12/07 04:24:40 INFO Executor: Running task 43.0 in stage 25.0 (TID 5101)
17/12/07 04:24:40 INFO TorrentBroadcast: Started reading broadcast variable 32
17/12/07 04:24:40 INFO MemoryStore: Block broadcast_32_piece0 stored as bytes in memory (estimated size 7.6 KB, free 1052.9 MB)
17/12/07 04:24:40 INFO TorrentBroadcast: Reading broadcast variable 32 took 7 ms
17/12/07 04:24:40 INFO MemoryStore: Block broadcast_32 stored as values in memory (estimated size 18.1 KB, free 1052.8 MB)
17/12/07 04:24:40 INFO TorrentBroadcast: Started reading broadcast variable 29
17/12/07 04:24:40 INFO MemoryStore: Block broadcast_29_piece0 stored as bytes in memory (estimated size 29.3 KB, free 1052.8 MB)
17/12/07 04:24:40 INFO TorrentBroadcast: Reading broadcast variable 29 took 7 ms
17/12/07 04:24:40 INFO MemoryStore: Block broadcast_29 stored as values in memory (estimated size 381.7 KB, free 1052.4 MB)
17/12/07 04:24:40 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:40 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:40 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:40 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:41 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:41 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:41 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:41 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:42 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:42 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:42 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:42 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:43 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:43 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:43 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:43 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:45 INFO TorrentBroadcast: Started reading broadcast variable 33
17/12/07 04:24:45 INFO MemoryStore: Block broadcast_33_piece0 stored as bytes in memory (estimated size 29.3 KB, free 1052.4 MB)
17/12/07 04:24:45 INFO TorrentBroadcast: Reading broadcast variable 33 took 5 ms
17/12/07 04:24:45 INFO MemoryStore: Block broadcast_33 stored as values in memory (estimated size 381.7 KB, free 1052.0 MB)
17/12/07 04:24:45 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:45 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:45 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:45 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:46 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:46 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:46 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:46 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:46 INFO InternalParquetRecordReader: RecordReader initialized will read a total of 453906 records.
17/12/07 04:24:46 INFO InternalParquetRecordReader: at row 0. reading next block
17/12/07 04:24:46 INFO InternalParquetRecordReader: block read in memory in 37 ms. row count = 453906
17/12/07 04:24:48 INFO Executor: Finished task 92.0 in stage 29.0 (TID 5892). 2156 bytes result sent to driver
17/12/07 04:24:48 INFO CoarseGrainedExecutorBackend: Got assigned task 5979
17/12/07 04:24:48 INFO Executor: Running task 40.0 in stage 26.0 (TID 5979)
17/12/07 04:24:48 INFO MapOutputTrackerWorker: Updating epoch to 18 and clearing cache
17/12/07 04:24:48 INFO TorrentBroadcast: Started reading broadcast variable 37
17/12/07 04:24:48 INFO TransportClientFactory: Successfully created connection to analytics1043.eqiad.wmnet/10.64.53.23:41817 after 1 ms (0 ms spent in bootstraps)
17/12/07 04:24:48 INFO MemoryStore: Block broadcast_37_piece0 stored as bytes in memory (estimated size 18.3 KB, free 1052.0 MB)
17/12/07 04:24:48 INFO TorrentBroadcast: Reading broadcast variable 37 took 12 ms
17/12/07 04:24:48 INFO MemoryStore: Block broadcast_37 stored as values in memory (estimated size 41.3 KB, free 1051.9 MB)
17/12/07 04:24:48 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle 15, fetching them
17/12/07 04:24:48 INFO MapOutputTrackerWorker: Doing the fetch; tracker endpoint = NettyRpcEndpointRef(spark://MapOutputTracker@10.64.53.30:40931)
17/12/07 04:24:48 INFO MapOutputTrackerWorker: Got the output locations
17/12/07 04:24:48 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:24:48 INFO ShuffleBlockFetcherIterator: Started 61 remote fetches in 21 ms
17/12/07 04:24:48 INFO MapOutputTrackerWorker: Don't have map outputs for shuffle 16, fetching them
17/12/07 04:24:48 INFO MapOutputTrackerWorker: Doing the fetch; tracker endpoint = NettyRpcEndpointRef(spark://MapOutputTracker@10.64.53.30:40931)
17/12/07 04:24:48 INFO MapOutputTrackerWorker: Got the output locations
17/12/07 04:24:48 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 378 blocks
17/12/07 04:24:48 INFO ShuffleBlockFetcherIterator: Started 2 remote fetches in 1 ms
17/12/07 04:24:48 INFO Executor: Finished task 40.0 in stage 26.0 (TID 5979). 4712 bytes result sent to driver
17/12/07 04:24:48 INFO CoarseGrainedExecutorBackend: Got assigned task 6031
17/12/07 04:24:48 INFO Executor: Running task 119.0 in stage 26.0 (TID 6031)
17/12/07 04:24:48 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:24:48 INFO ShuffleBlockFetcherIterator: Started 61 remote fetches in 14 ms
17/12/07 04:24:48 INFO ShuffleBlockFetcherIterator: Getting 2 non-empty blocks out of 378 blocks
17/12/07 04:24:48 INFO ShuffleBlockFetcherIterator: Started 2 remote fetches in 0 ms
17/12/07 04:24:49 INFO Executor: Finished task 119.0 in stage 26.0 (TID 6031). 4197 bytes result sent to driver
17/12/07 04:24:49 INFO CoarseGrainedExecutorBackend: Got assigned task 6119
17/12/07 04:24:49 INFO Executor: Running task 181.0 in stage 26.0 (TID 6119)
17/12/07 04:24:49 INFO ShuffleBlockFetcherIterator: Getting 200 non-empty blocks out of 400 blocks
17/12/07 04:24:49 INFO ShuffleBlockFetcherIterator: Started 61 remote fetches in 16 ms
17/12/07 04:24:49 INFO ShuffleBlockFetcherIterator: Getting 3 non-empty blocks out of 378 blocks
17/12/07 04:24:49 INFO ShuffleBlockFetcherIterator: Started 3 remote fetches in 1 ms
17/12/07 04:24:50 INFO Executor: Finished task 181.0 in stage 26.0 (TID 6119). 4197 bytes result sent to driver
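The repeating pattern above (MapOutputTrackerWorker fetching map-output locations, then ShuffleBlockFetcherIterator reporting "200 non-empty blocks out of 400" and "61 remote fetches") reflects how each reduce task filters out empty shuffle blocks and groups the rest by host before fetching. The sketch below is a hypothetical, simplified model of that filtering and grouping, not Spark's actual implementation; the block sizes and the 61-host layout are invented to match the counts seen in this log.

```python
# Hypothetical model of the per-task shuffle fetch seen in the log:
# each (host, map_id) block has a size; empty blocks are skipped, and one
# remote fetch is started per distinct host holding non-empty blocks.
blocks = {("host-%d" % (i % 61), i): (i % 2) for i in range(400)}  # size 0 or 1, invented

# Keep only non-empty blocks, as ShuffleBlockFetcherIterator does.
non_empty = {k: size for k, size in blocks.items() if size > 0}

# One fetch per distinct remote host with at least one non-empty block.
remote_hosts = {host for (host, _map_id) in non_empty}

print("Getting %d non-empty blocks out of %d blocks" % (len(non_empty), len(blocks)))
print("Started %d remote fetches" % len(remote_hosts))
```

With this invented layout the counts come out as 200 non-empty blocks and 61 remote fetches, matching the log lines above; the real iterator additionally batches requests by byte size and fetches local blocks without a network round trip.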
17/12/07 04:24:50 INFO CoarseGrainedExecutorBackend: Got assigned task 6180
17/12/07 04:24:50 INFO Executor: Running task 148.0 in stage 29.0 (TID 6180)
17/12/07 04:24:50 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:50 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:50 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:50 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:51 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:51 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:51 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:51 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:51 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:51 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:51 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:51 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:51 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:51 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:51 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:51 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:51 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:51 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:51 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:51 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
17/12/07 04:24:51 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:51 WARN ParquetRecordReader: Can not initialize counter due to context is not a instance of TaskInputOutputContext, but is org.apache.hadoop.mapreduce.task.TaskAttemptContextImpl
17/12/07 04:24:51 INFO FilterCompat: Filtering using predicate: noteq(norm_query_id, null)
17/12/07 04:24:51 INFO ParquetReadSupport: Going to read the following fields from the Parquet file:
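The "Filtering using predicate: noteq(norm_query_id, null)" lines show a filter (equivalent to SQL `norm_query_id IS NOT NULL`) pushed down into the Parquet reader, logged once per split as FilterCompat applies it. The sketch below is a hypothetical illustration of why such pushdown helps: a reader can skip a row group entirely when its statistics prove every value is null. The statistics layout here is simplified and invented, not parquet-mr's actual API.

```python
# Hypothetical row-group statistics; in a real Parquet footer these would be
# per-column-chunk stats (null_count, min, max) for norm_query_id.
row_groups = [
    {"num_rows": 100, "null_count": 100},  # all values null -> skippable
    {"num_rows": 100, "null_count": 37},   # some non-null rows -> must read
    {"num_rows": 100, "null_count": 0},    # no nulls -> must read
]

def survives_noteq_null(stats):
    # noteq(norm_query_id, null) keeps rows where the column is NOT null,
    # so a row group can be skipped only when every value in it is null.
    return stats["null_count"] < stats["num_rows"]

kept = [g for g in row_groups if survives_noteq_null(g)]
```

Row groups that fail the check are never decompressed or decoded, which is why pushing the predicate into FilterCompat is cheaper than filtering the same rows after the scan.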