
[spark-src-core] 3. Run Spark in cluster(local) mode

 

  Yep, just as you might guess, there are many deploy modes in Spark, e.g. standalone, YARN, Mesos, etc. Going a step further, the standalone mode can be divided into standalone and cluster(local) mode. The former is a real cluster mode in which the master and the workers all run on individual nodes, while in the latter the master and worker run in the same JVM and the executor is launched in another one.
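  As a side note, Spark exposes this mode through the local-cluster master URL, whose three bracketed numbers are the worker count, cores per worker, and memory per worker in MB. A minimal sketch, assuming the Spark 1.x run-example script honors the MASTER env variable (the sizes are illustrative only, and happen to match the "1 workers ... 1 cores, 512.0 MB RAM" lines in the logs further down):

MASTER="local-cluster[1,1,512]" run-example ScalaWordCount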

   Let me have a glance at the launch bird's-eye view:


 

 

Some steps to set up cluster(local) mode in Spark:

1. Run in IDE

  It's different from the pure local deploy mode: cluster(local) mode will launch an executor, which means certain jars must be on the classpath of the process that executes the user class. Otherwise, an exception like the one listed below will occur:

java.lang.NoClassDefFoundError: org/apache/mesos/Protos$TaskState
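  One workaround is to make the assembly jar visible both to the driver JVM (add it to the IDE run configuration's classpath) and to the launched executor. Below is a minimal sketch of the driver side, assuming a local-cluster master string and an assembly jar path that you would adjust to your own build:

import org.apache.spark.{SparkConf, SparkContext}

// A sketch only: the master string and the jar path are illustrative assumptions.
val conf = new SparkConf()
  .setAppName("ide-cluster-local")
  .setMaster("local-cluster[1,1,512]") // one worker, one core, 512 MB
  .setJars(Seq("/path/to/spark-assembly-1.4.1-hadoop2.5.2.jar")) // shipped to the executor
val sc = new SparkContext(conf)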

 2. Run outside of IDE

  2.1 Generate an assembly jar which contains all the necessary Spark, Hadoop, and other third-party jars

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.2 -DskipTests clean package

   Now you will see a jar named "spark-assembly-1.4.1-hadoop2.5.2.jar" in the folder assembly/target/scala-2.10/, and an examples jar named "spark-examples-1.4.1-hadoop2.5.2.jar" in examples/target/scala-2.10/.

 

  2.2 Set up certain env variables which the executor needs.

HADOOP_HOME=/path/to/hadoop
SPARK_HOME=/path/to/spark
export HADOOP_HOME
export SPARK_HOME

   

  2.3 Issue the example command

run-example ScalaWordCount

    Then you will see the concrete proxy class named "SparkSubmit" being launched:

executed cmd:/Library/Java/JavaVirtualMachines/jdk1.7.0_45.jdk/Contents/Home/bin/java -cp /Users/userxx/Cloud/Spark/spark-1.4.1/conf/:/Users/userxx/Cloud/Spark/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.5.2.jar:/Users/userxx/Cloud/Spark/spark-1.4.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar:/Users/userxx/Cloud/Spark/spark-1.4.1/lib_managed/jars/datanucleus-core-3.2.10.jar:/Users/userxx/Cloud/Spark/spark-1.4.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.SparkSubmit --master local[*] --class org.apache.spark.examples.ScalaWordCount /Users/userxx/Cloud/Spark/spark-1.4.1/examples/target/scala-2.10/spark-examples-1.4.1-hadoop2.5.2.jar
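  For context, ScalaWordCount is a custom example class rather than one shipped with the stock Spark examples. Reconstructed from the line references in the logs below (textFile at line 41, flatMap at 49, map at 50, reduceByKey at 52, collect at 63), it is roughly the classic word count; a sketch:

import org.apache.spark.{SparkConf, SparkContext}

// Rough reconstruction of the example the logs trace; the input path is
// taken from the HadoopRDD line in the logs.
object ScalaWordCount {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("ScalaWordCount"))
    val counts = sc.textFile("examples/src/main/resources/CHANGES.txt")
      .flatMap(_.split("\\s+")) // split each line into words
      .map(word => (word, 1))   // pair each word with a count of 1
      .reduceByKey(_ + _)       // sum the counts per word
    counts.collect().foreach(println)
    sc.stop()
  }
}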

 

3. Logs from starting up this cluster mode

 

MacBook:spark-1.4.1 userxx$ run-example ScalaWordCount
Spark Command: /Library/Java/JavaVirtualMachines/jdk1.7.0_45.jdk/Contents/Home/bin/java -cp /Users/userxx/Cloud/Spark/spark-1.4.1/conf/:/Users/userxx/Cloud/Spark/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.5.2.jar:/Users/userxx/Cloud/Spark/spark-1.4.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar:/Users/userxx/Cloud/Spark/spark-1.4.1/lib_managed/jars/datanucleus-core-3.2.10.jar:/Users/userxx/Cloud/Spark/spark-1.4.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar -Xms512m -Xmx512m -XX:MaxPermSize=256m org.apache.spark.deploy.SparkSubmit --master local[*] --class org.apache.spark.examples.ScalaWordCount /Users/userxx/Cloud/Spark/spark-1.4.1/examples/target/scala-2.10/spark-examples-1.4.1-hadoop2.5.2.jar
========================================
-child main class to launch this app org.apache.spark.examples.ScalaWordCount
16/09/02 15:56:22 INFO SparkContext: Running Spark version 1.4.1
16/09/02 15:56:22 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of successful kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
16/09/02 15:56:22 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of failed kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
16/09/02 15:56:22 DEBUG MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[GetGroups], always=false, type=DEFAULT, sampleName=Ops)
16/09/02 15:56:22 DEBUG MetricsSystemImpl: UgiMetrics, User and group related metrics
2016-09-02 15:56:22.527 java[1449:1903] Unable to load realm info from SCDynamicStore
16/09/02 15:56:22 DEBUG KerberosName: Kerberos krb5 configuration not found, setting default realm to empty
16/09/02 15:56:22 DEBUG Groups:  Creating new Groups object
16/09/02 15:56:22 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
16/09/02 15:56:22 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
16/09/02 15:56:22 DEBUG NativeCodeLoader: java.library.path=/Users/userxx/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
16/09/02 15:56:22 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/09/02 15:56:22 DEBUG JniBasedUnixGroupsMappingWithFallback: Falling back to shell based
16/09/02 15:56:22 DEBUG JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
16/09/02 15:56:22 DEBUG Shell: setsid is not available on this machine. So not using it.
16/09/02 15:56:22 DEBUG Shell: setsid exited with exit code 0
16/09/02 15:56:22 DEBUG Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
16/09/02 15:56:22 DEBUG UserGroupInformation: hadoop login
16/09/02 15:56:22 DEBUG UserGroupInformation: hadoop login commit
16/09/02 15:56:22 DEBUG UserGroupInformation: using local user:UnixPrincipal: userxx
16/09/02 15:56:22 DEBUG UserGroupInformation: UGI loginUser:userxx (auth:SIMPLE)
16/09/02 15:56:22 WARN Utils: Your hostname, MacBook resolves to a loopback address: 127.0.0.1; using 192.168.1.136 instead (on interface en0)
16/09/02 15:56:22 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
16/09/02 15:56:22 INFO SecurityManager: Changing view acls to: userxx
16/09/02 15:56:22 INFO SecurityManager: Changing modify acls to: userxx
16/09/02 15:56:22 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(userxx); users with modify permissions: Set(userxx)
16/09/02 15:56:22 DEBUG SSLOptions: No SSL protocol specified
16/09/02 15:56:23 DEBUG SSLOptions: No SSL protocol specified
16/09/02 15:56:23 DEBUG SSLOptions: No SSL protocol specified
16/09/02 15:56:23 DEBUG SecurityManager: SSLConfiguration for file server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/09/02 15:56:23 DEBUG SecurityManager: SSLConfiguration for Akka: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/09/02 15:56:23 DEBUG AkkaUtils: In createActorSystem, requireCookie is: off
16/09/02 15:56:23 INFO Slf4jLogger: Slf4jLogger started
16/09/02 15:56:23 INFO Remoting: Starting remoting
16/09/02 15:56:23 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.1.136:53712]
16/09/02 15:56:23 INFO Utils: Successfully started service 'sparkDriver' on port 53712.
16/09/02 15:56:23 DEBUG SparkEnv: Using serializer: class org.apache.spark.serializer.JavaSerializer
16/09/02 15:56:23 INFO SparkEnv: Registering MapOutputTracker
16/09/02 15:56:23 INFO AkkaRpcEnv: -endpontRef AkkaRpcEndpointRef(Actor[akka://sparkDriver/user/MapOutputTracker#-1786103142]),actorRef class akka.actor.RepointableActorRef
16/09/02 15:56:23 INFO SparkEnv: -shuffle manager:org.apache.spark.shuffle.sort.SortShuffleManager@2ff94f45
16/09/02 15:56:23 INFO SparkEnv: Registering BlockManagerMaster
16/09/02 15:56:23 INFO AkkaRpcEnv: -endpontRef AkkaRpcEndpointRef(Actor[akka://sparkDriver/user/BlockManagerMaster#960349843]),actorRef class akka.actor.RepointableActorRef
16/09/02 15:56:23 INFO DiskBlockManager: Created local directory at /private/var/folders/rt/6f6nq06577vb3c0d8bskm97m0000gn/T/spark-4cad14d1-49d6-4130-a484-2d5c264f67ab/blockmgr-c6228943-b29b-4942-adb7-9a49b227348e
16/09/02 15:56:23 INFO MemoryStore: MemoryStore started with capacity 265.4 MB
16/09/02 15:56:23 INFO AkkaRpcEnv: -endpontRef AkkaRpcEndpointRef(Actor[akka://sparkDriver/user/BlockManagerEndpoint1#756110845]),actorRef class akka.actor.RepointableActorRef
16/09/02 15:56:23 INFO HttpFileServer: HTTP File server directory is /private/var/folders/rt/6f6nq06577vb3c0d8bskm97m0000gn/T/spark-4cad14d1-49d6-4130-a484-2d5c264f67ab/httpd-69dfaf66-b9cc-43e0-b300-204336b669bb
16/09/02 15:56:23 INFO HttpServer: Starting HTTP Server
16/09/02 15:56:24 DEBUG HttpServer: HttpServer is not using security
16/09/02 15:56:24 INFO Server: jetty-8.y.z-SNAPSHOT
16/09/02 15:56:24 INFO AbstractConnector: Started SocketConnector@0.0.0.0:53713
16/09/02 15:56:24 INFO Utils: Successfully started service 'HTTP file server' on port 53713.
16/09/02 15:56:24 DEBUG HttpFileServer: HTTP file server started at: http://192.168.1.136:53713
16/09/02 15:56:24 INFO SparkEnv: Registering OutputCommitCoordinator
16/09/02 15:56:24 INFO AkkaRpcEnv: -endpontRef AkkaRpcEndpointRef(Actor[akka://sparkDriver/user/OutputCommitCoordinator#1524586475]),actorRef class akka.actor.RepointableActorRef
16/09/02 15:56:24 INFO Server: jetty-8.y.z-SNAPSHOT
16/09/02 15:56:24 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/09/02 15:56:24 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/09/02 15:56:24 INFO SparkUI: Started SparkUI at http://192.168.1.136:4040
16/09/02 15:56:24 INFO SparkContext: Added JAR file:/Users/userxx/Cloud/Spark/spark-1.4.1/examples/target/scala-2.10/spark-examples-1.4.1-hadoop2.5.2.jar at http://192.168.1.136:53713/jars/spark-examples-1.4.1-hadoop2.5.2.jar with timestamp 1472802984541
16/09/02 15:56:24 INFO AkkaRpcEnv: -endpontRef AkkaRpcEndpointRef(Actor[akka://sparkDriver/user/HeartbeatReceiver#-1337218123]),actorRef class akka.actor.RepointableActorRef
16/09/02 15:56:24 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(ExpireDeadHosts,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:24 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(ExpireDeadHosts,false)
16/09/02 15:56:24 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (3.489 ms) AkkaMessage(ExpireDeadHosts,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:24 INFO LocalSparkCluster: Starting a local Spark cluster with 1 workers.
16/09/02 15:56:24 INFO SecurityManager: Changing view acls to: userxx
16/09/02 15:56:24 INFO SecurityManager: Changing modify acls to: userxx
16/09/02 15:56:24 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(userxx); users with modify permissions: Set(userxx)
16/09/02 15:56:24 DEBUG SSLOptions: No SSL protocol specified
16/09/02 15:56:24 DEBUG SSLOptions: No SSL protocol specified
16/09/02 15:56:24 DEBUG SSLOptions: No SSL protocol specified
16/09/02 15:56:24 DEBUG SecurityManager: SSLConfiguration for file server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/09/02 15:56:24 DEBUG SecurityManager: SSLConfiguration for Akka: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/09/02 15:56:24 DEBUG AkkaUtils: In createActorSystem, requireCookie is: off
16/09/02 15:56:24 INFO Slf4jLogger: Slf4jLogger started
16/09/02 15:56:24 INFO Remoting: Starting remoting
16/09/02 15:56:24 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkMaster@192.168.1.136:53714]
16/09/02 15:56:24 INFO Utils: Successfully started service 'sparkMaster' on port 53714.
16/09/02 15:56:24 INFO Master: Starting Spark master at spark://192.168.1.136:53714
16/09/02 15:56:24 INFO Master: Running Spark version 1.4.1
16/09/02 15:56:24 INFO Server: jetty-8.y.z-SNAPSHOT
16/09/02 15:56:24 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:53715
16/09/02 15:56:24 INFO Utils: Successfully started service 'MasterUI' on port 53715.
16/09/02 15:56:24 INFO MasterWebUI: Started MasterWebUI at http://192.168.1.136:53715
16/09/02 15:56:24 DEBUG Master: [actor] received message BoundPortsRequest from Actor[akka://sparkMaster/temp/$a]
16/09/02 15:56:24 DEBUG Master: [actor] handled message (1.968 ms) BoundPortsRequest from Actor[akka://sparkMaster/temp/$a]
16/09/02 15:56:24 DEBUG Master: [actor] received message CheckForWorkerTimeOut from Actor[akka://sparkMaster/user/Master#1119045880]
16/09/02 15:56:24 DEBUG Master: [actor] handled message (1.754 ms) CheckForWorkerTimeOut from Actor[akka://sparkMaster/user/Master#1119045880]
16/09/02 15:56:24 DEBUG Master: [actor] received message ElectedLeader from Actor[akka://sparkMaster/user/Master#1119045880]
16/09/02 15:56:24 INFO SecurityManager: Changing view acls to: userxx
16/09/02 15:56:24 INFO SecurityManager: Changing modify acls to: userxx
16/09/02 15:56:24 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(userxx); users with modify permissions: Set(userxx)
16/09/02 15:56:24 DEBUG SSLOptions: No SSL protocol specified
16/09/02 15:56:24 INFO Master: I have been elected leader! New state: ALIVE
16/09/02 15:56:24 DEBUG Master: [actor] handled message (2.746 ms) ElectedLeader from Actor[akka://sparkMaster/user/Master#1119045880]
16/09/02 15:56:24 DEBUG SSLOptions: No SSL protocol specified
16/09/02 15:56:24 DEBUG SSLOptions: No SSL protocol specified
16/09/02 15:56:24 DEBUG SecurityManager: SSLConfiguration for file server: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/09/02 15:56:24 DEBUG SecurityManager: SSLConfiguration for Akka: SSLOptions{enabled=false, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
16/09/02 15:56:24 DEBUG AkkaUtils: In createActorSystem, requireCookie is: off
16/09/02 15:56:24 INFO Slf4jLogger: Slf4jLogger started
16/09/02 15:56:24 INFO Remoting: Starting remoting
16/09/02 15:56:24 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkWorker1@192.168.1.136:53716]
16/09/02 15:56:24 INFO Utils: Successfully started service 'sparkWorker1' on port 53716.
16/09/02 15:56:25 DEBUG InternalLoggerFactory: Using SLF4J as the default logging framework
16/09/02 15:56:25 DEBUG PlatformDependent0: java.nio.Buffer.address: available
16/09/02 15:56:25 DEBUG PlatformDependent0: sun.misc.Unsafe.theUnsafe: available
16/09/02 15:56:25 DEBUG PlatformDependent0: sun.misc.Unsafe.copyMemory: available
16/09/02 15:56:25 DEBUG PlatformDependent0: java.nio.Bits.unaligned: true
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(TaskSchedulerIsSet,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(TaskSchedulerIsSet,false)
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.133 ms) AkkaMessage(TaskSchedulerIsSet,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 INFO AkkaRpcEnv: -endpontRef AkkaRpcEndpointRef(Actor[akka://sparkDriver/user/CoarseGrainedScheduler#2050875568]),actorRef class akka.actor.RepointableActorRef
16/09/02 15:56:25 DEBUG PlatformDependent: UID: 501
16/09/02 15:56:25 DEBUG PlatformDependent: Java version: 7
16/09/02 15:56:25 DEBUG PlatformDependent: -Dio.netty.noUnsafe: false
16/09/02 15:56:25 DEBUG PlatformDependent: sun.misc.Unsafe: available
16/09/02 15:56:25 DEBUG PlatformDependent: -Dio.netty.noJavassist: false
16/09/02 15:56:25 DEBUG PlatformDependent: Javassist: unavailable
16/09/02 15:56:25 DEBUG PlatformDependent: You don't have Javassist in your class path or you don't have enough permission to load dynamically generated classes.  Please check the configuration for better performance.
16/09/02 15:56:25 DEBUG PlatformDependent: -Dio.netty.tmpdir: /var/folders/rt/6f6nq06577vb3c0d8bskm97m0000gn/T (java.io.tmpdir)
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(ReviveOffers,false)
16/09/02 15:56:25 DEBUG PlatformDependent: -Dio.netty.bitMode: 64 (sun.arch.data.model)
16/09/02 15:56:25 DEBUG PlatformDependent: -Dio.netty.noPreferDirect: false
16/09/02 15:56:25 INFO Worker: Starting Spark worker 192.168.1.136:53716 with 1 cores, 512.0 MB RAM
16/09/02 15:56:25 INFO Worker: Running Spark version 1.4.1
16/09/02 15:56:25 INFO AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@192.168.1.136:53714/user/Master...
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (40.611 ms) AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 INFO Worker: Spark home: /Users/userxx/Cloud/Spark/spark-1.4.1
16/09/02 15:56:25 INFO Server: jetty-8.y.z-SNAPSHOT
16/09/02 15:56:25 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:53717
16/09/02 15:56:25 INFO Utils: Successfully started service 'WorkerUI' on port 53717.
16/09/02 15:56:25 INFO WorkerWebUI: Started WorkerWebUI at http://192.168.1.136:53717
16/09/02 15:56:25 INFO Worker: Connecting to master akka.tcp://sparkMaster@192.168.1.136:53714/user/Master...
16/09/02 15:56:25 DEBUG Master: Received unexpected actor system event: Associated [akka.tcp://sparkMaster@192.168.1.136:53714] <- [akka.tcp://sparkDriver@192.168.1.136:53712]
16/09/02 15:56:25 DEBUG Master: Received unexpected actor system event: Associated [akka.tcp://sparkMaster@192.168.1.136:53714] <- [akka.tcp://sparkWorker1@192.168.1.136:53716]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (4.41 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (3.569 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (4.587 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (3.455 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (2.64 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (4.272 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:25 DEBUG AppClient$ClientActor: Received unexpected actor system event: Associated [akka.tcp://sparkDriver@192.168.1.136:53712] -> [akka.tcp://sparkMaster@192.168.1.136:53714]
16/09/02 15:56:25 DEBUG Worker: Received unexpected actor system event: Associated [akka.tcp://sparkWorker1@192.168.1.136:53716] -> [akka.tcp://sparkMaster@192.168.1.136:53714]
16/09/02 15:56:25 DEBUG Master: [actor] received message RegisterWorker(worker-20160902155625-192.168.1.136-53716,192.168.1.136,53716,1,512,53717,192.168.1.136) from Actor[akka.tcp://sparkWorker1@192.168.1.136:53716/user/Worker#-1266321228]
16/09/02 15:56:25 INFO Master: Registering worker 192.168.1.136:53716 with 1 cores, 512.0 MB RAM
16/09/02 15:56:25 DEBUG Master: [actor] handled message (10.731 ms) RegisterWorker(worker-20160902155625-192.168.1.136-53716,192.168.1.136,53716,1,512,53717,192.168.1.136) from Actor[akka.tcp://sparkWorker1@192.168.1.136:53716/user/Worker#-1266321228]
16/09/02 15:56:25 DEBUG Master: [actor] received message RegisterApplication(ApplicationDescription(ScalaWordCount$)) from Actor[akka.tcp://sparkDriver@192.168.1.136:53712/user/$a#2053731478]
16/09/02 15:56:25 INFO Master: Registering app ScalaWordCount$
16/09/02 15:56:25 DEBUG Worker: [actor] received message RegisteredWorker(spark://192.168.1.136:53714,http://192.168.1.136:53715) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 INFO Worker: Successfully registered with master spark://192.168.1.136:53714
16/09/02 15:56:25 DEBUG Worker: [actor] handled message (2.864 ms) RegisteredWorker(spark://192.168.1.136:53714,http://192.168.1.136:53715) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 DEBUG Worker: [actor] received message SendHeartbeat from Actor[akka://sparkWorker1/user/Worker#-1266321228]
16/09/02 15:56:25 DEBUG Worker: [actor] handled message (0.2 ms) SendHeartbeat from Actor[akka://sparkWorker1/user/Worker#-1266321228]
16/09/02 15:56:25 INFO Master: Registered app ScalaWordCount$ with ID app-20160902155625-0000
16/09/02 15:56:25 DEBUG AppClient$ClientActor: [actor] received message RegisteredApplication(app-20160902155625-0000,spark://192.168.1.136:53714) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 INFO SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20160902155625-0000
16/09/02 15:56:25 DEBUG AppClient$ClientActor: [actor] handled message (1.351 ms) RegisteredApplication(app-20160902155625-0000,spark://192.168.1.136:53714) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 INFO Master: Launching executor app-20160902155625-0000/0 on worker worker-20160902155625-192.168.1.136-53716
16/09/02 15:56:25 DEBUG Master: [actor] handled message (19.216 ms) RegisterApplication(ApplicationDescription(ScalaWordCount$)) from Actor[akka.tcp://sparkDriver@192.168.1.136:53712/user/$a#2053731478]
16/09/02 15:56:25 DEBUG Master: [actor] received message Heartbeat(worker-20160902155625-192.168.1.136-53716) from Actor[akka.tcp://sparkWorker1@192.168.1.136:53716/user/Worker#-1266321228]
16/09/02 15:56:25 DEBUG Master: [actor] handled message (0.037 ms) Heartbeat(worker-20160902155625-192.168.1.136-53716) from Actor[akka.tcp://sparkWorker1@192.168.1.136:53716/user/Worker#-1266321228]
16/09/02 15:56:25 DEBUG MultithreadEventLoopGroup: -Dio.netty.eventLoopThreads: 8
16/09/02 15:56:25 DEBUG AppClient$ClientActor: [actor] received message ExecutorAdded(0,worker-20160902155625-192.168.1.136-53716,192.168.1.136:53716,1,512) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 INFO AppClient$ClientActor: Executor added: app-20160902155625-0000/0 on worker-20160902155625-192.168.1.136-53716 (192.168.1.136:53716) with 1 cores
16/09/02 15:56:25 INFO SparkDeploySchedulerBackend: Granted executor ID app-20160902155625-0000/0 on hostPort 192.168.1.136:53716 with 1 cores, 512.0 MB RAM
16/09/02 15:56:25 DEBUG AppClient$ClientActor: [actor] handled message (3.19 ms) ExecutorAdded(0,worker-20160902155625-192.168.1.136-53716,192.168.1.136:53716,1,512) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 DEBUG Worker: [actor] received message LaunchExecutor(spark://192.168.1.136:53714,app-20160902155625-0000,0,ApplicationDescription(ScalaWordCount$),1,512) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 INFO Worker: Asked to launch executor app-20160902155625-0000/0 for ScalaWordCount$
16/09/02 15:56:25 DEBUG Master: [actor] received message ExecutorStateChanged(app-20160902155625-0000,0,RUNNING,None,None) from Actor[akka.tcp://sparkDriver@192.168.1.136:53712/user/$a#2053731478]
16/09/02 15:56:25 DEBUG Worker: [actor] handled message (22.256 ms) LaunchExecutor(spark://192.168.1.136:53714,app-20160902155625-0000,0,ApplicationDescription(ScalaWordCount$),1,512) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 DEBUG Master: [actor] handled message (4.121 ms) ExecutorStateChanged(app-20160902155625-0000,0,RUNNING,None,None) from Actor[akka.tcp://sparkDriver@192.168.1.136:53712/user/$a#2053731478]
16/09/02 15:56:25 DEBUG Master: [actor] received message ExecutorStateChanged(app-20160902155625-0000,0,LOADING,None,None) from Actor[akka.tcp://sparkWorker1@192.168.1.136:53716/user/Worker#-1266321228]
16/09/02 15:56:25 DEBUG AppClient$ClientActor: [actor] received message ExecutorUpdated(0,RUNNING,None,None) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 DEBUG Master: [actor] handled message (0.365 ms) ExecutorStateChanged(app-20160902155625-0000,0,LOADING,None,None) from Actor[akka.tcp://sparkWorker1@192.168.1.136:53716/user/Worker#-1266321228]
16/09/02 15:56:25 INFO AppClient$ClientActor: Executor updated: app-20160902155625-0000/0 is now RUNNING
16/09/02 15:56:25 DEBUG AppClient$ClientActor: [actor] handled message (1.808 ms) ExecutorUpdated(0,RUNNING,None,None) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 DEBUG NioEventLoop: -Dio.netty.noKeySetOptimization: false
16/09/02 15:56:25 DEBUG NioEventLoop: -Dio.netty.selectorAutoRebuildThreshold: 512
16/09/02 15:56:25 INFO ExecutorRunner: Launch command: "/Library/Java/JavaVirtualMachines/jdk1.7.0_45.jdk/Contents/Home/bin/java" "-cp" "/Users/userxx/Cloud/Spark/spark-1.4.1/conf/:/Users/userxx/Cloud/Spark/spark-1.4.1/assembly/target/scala-2.10/spark-assembly-1.4.1-hadoop2.5.2.jar:/Users/userxx/Cloud/Spark/spark-1.4.1/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar:/Users/userxx/Cloud/Spark/spark-1.4.1/lib_managed/jars/datanucleus-core-3.2.10.jar:/Users/userxx/Cloud/Spark/spark-1.4.1/lib_managed/jars/datanucleus-rdbms-3.2.9.jar" "-Xms512M" "-Xmx512M" "-Dspark.driver.port=53712" "-XX:MaxPermSize=256m" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "akka.tcp://sparkDriver@192.168.1.136:53712/user/CoarseGrainedScheduler" "--executor-id" "0" "--hostname" "192.168.1.136" "--cores" "1" "--app-id" "app-20160902155625-0000" "--worker-url" "akka.tcp://sparkWorker1@192.168.1.136:53716/user/Worker"
16/09/02 15:56:25 DEBUG AppClient$ClientActor: [actor] received message ExecutorUpdated(0,LOADING,None,None) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 INFO AppClient$ClientActor: Executor updated: app-20160902155625-0000/0 is now LOADING
16/09/02 15:56:25 DEBUG AppClient$ClientActor: [actor] handled message (5.322 ms) ExecutorUpdated(0,LOADING,None,None) from Actor[akka.tcp://sparkMaster@192.168.1.136:53714/user/Master#1119045880]
16/09/02 15:56:25 DEBUG FileAppender: Started appending thread
16/09/02 15:56:25 DEBUG FileAppender: Started appending thread
16/09/02 15:56:25 DEBUG FileAppender: Opened file /Users/userxx/Cloud/Spark/spark-1.4.1/work/app-20160902155625-0000/0/stderr
16/09/02 15:56:25 DEBUG FileAppender: Opened file /Users/userxx/Cloud/Spark/spark-1.4.1/work/app-20160902155625-0000/0/stdout
16/09/02 15:56:25 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.numHeapArenas: 4
16/09/02 15:56:25 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.numDirectArenas: 4
16/09/02 15:56:25 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.pageSize: 8192
16/09/02 15:56:25 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.maxOrder: 11
16/09/02 15:56:25 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.chunkSize: 16777216
16/09/02 15:56:25 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.tinyCacheSize: 512
16/09/02 15:56:25 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.smallCacheSize: 256
16/09/02 15:56:25 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.normalCacheSize: 64
16/09/02 15:56:25 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.maxCachedBufferCapacity: 32768
16/09/02 15:56:25 DEBUG PooledByteBufAllocator: -Dio.netty.allocator.cacheTrimInterval: 8192
16/09/02 15:56:25 DEBUG ThreadLocalRandom: -Dio.netty.initialSeedUniquifier: 0x51dc83d11932c25e (took 1 ms)
16/09/02 15:56:25 DEBUG ByteBufUtil: -Dio.netty.allocator.type: unpooled
16/09/02 15:56:25 DEBUG ByteBufUtil: -Dio.netty.threadLocalDirectBufferSize: 65536
16/09/02 15:56:25 DEBUG NetUtil: Loopback interface: lo0 (lo0, 0:0:0:0:0:0:0:1)
16/09/02 15:56:25 DEBUG NetUtil: /proc/sys/net/core/somaxconn: 128 (non-existent)
16/09/02 15:56:25 DEBUG TransportServer: Shuffle server started on port :53720
16/09/02 15:56:25 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 53720.
16/09/02 15:56:25 INFO NettyBlockTransferService: Server created on 53720
16/09/02 15:56:25 INFO BlockManagerMaster: Trying to register BlockManager
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(RegisterBlockManager(BlockManagerId(driver, 192.168.1.136, 53720),278302556,AkkaRpcEndpointRef(Actor[akka://sparkDriver/user/BlockManagerEndpoint1#756110845])),true) from Actor[akka://sparkDriver/temp/$a]
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(RegisterBlockManager(BlockManagerId(driver, 192.168.1.136, 53720),278302556,AkkaRpcEndpointRef(Actor[akka://sparkDriver/user/BlockManagerEndpoint1#756110845])),true)
16/09/02 15:56:25 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.136:53720 with 265.4 MB RAM, BlockManagerId(driver, 192.168.1.136, 53720)
16/09/02 15:56:25 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (14.196 ms) AkkaMessage(RegisterBlockManager(BlockManagerId(driver, 192.168.1.136, 53720),278302556,AkkaRpcEndpointRef(Actor[akka://sparkDriver/user/BlockManagerEndpoint1#756110845])),true) from Actor[akka://sparkDriver/temp/$a]
16/09/02 15:56:25 INFO BlockManagerMaster: Registered BlockManager
16/09/02 15:56:25 INFO SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(ReviveOffers,false)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.078 ms) AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:26 INFO MemoryStore: ensureFreeSpace(145112) called with curMem=0, maxMem=278302556
16/09/02 15:56:26 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 141.7 KB, free 265.3 MB)
16/09/02 15:56:26 DEBUG BlockManager: Put block broadcast_0 locally took  162 ms
16/09/02 15:56:26 DEBUG BlockManager: Putting block broadcast_0 without replication took  163 ms
16/09/02 15:56:26 INFO MemoryStore: ensureFreeSpace(13080) called with curMem=145112, maxMem=278302556
16/09/02 15:56:26 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 12.8 KB, free 265.3 MB)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(UpdateBlockInfo(BlockManagerId(driver, 192.168.1.136, 53720),broadcast_0_piece0,StorageLevel(false, true, false, false, 1),13080,0,0),true) from Actor[akka://sparkDriver/temp/$b]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(UpdateBlockInfo(BlockManagerId(driver, 192.168.1.136, 53720),broadcast_0_piece0,StorageLevel(false, true, false, false, 1),13080,0,0),true)
16/09/02 15:56:26 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.136:53720 (size: 12.8 KB, free: 265.4 MB)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (2.631 ms) AkkaMessage(UpdateBlockInfo(BlockManagerId(driver, 192.168.1.136, 53720),broadcast_0_piece0,StorageLevel(false, true, false, false, 1),13080,0,0),true) from Actor[akka://sparkDriver/temp/$b]
16/09/02 15:56:26 DEBUG BlockManagerMaster: Updated info of block broadcast_0_piece0
16/09/02 15:56:26 DEBUG BlockManager: Told master about block broadcast_0_piece0
16/09/02 15:56:26 DEBUG BlockManager: Put block broadcast_0_piece0 locally took  11 ms
16/09/02 15:56:26 DEBUG BlockManager: Putting block broadcast_0_piece0 without replication took  11 ms
16/09/02 15:56:26 INFO SparkContext: Created broadcast 0 from textFile at ScalaWordCount.scala:41
16/09/02 15:56:26 DEBUG ClosureCleaner: +++ Cleaning closure <function1> (org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$32}) +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared fields: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public static final long org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$32.serialVersionUID
16/09/02 15:56:26 DEBUG ClosureCleaner:      private final org.apache.spark.SparkContext$$anonfun$hadoopFile$1 org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$32.$outer
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared methods: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$32.apply(java.lang.Object)
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final void org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$32.apply(org.apache.hadoop.mapred.JobConf)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + inner classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer classes: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      org.apache.spark.SparkContext$$anonfun$hadoopFile$1
16/09/02 15:56:26 DEBUG ClosureCleaner:      org.apache.spark.SparkContext
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer objects: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      <function0>
16/09/02 15:56:26 DEBUG ClosureCleaner:      org.apache.spark.SparkContext@3d3a1232
16/09/02 15:56:26 DEBUG ClosureCleaner:  + populating accessed fields because this is the starting closure
16/09/02 15:56:26 DEBUG ClosureCleaner:  + fields accessed by starting closure: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      (class org.apache.spark.SparkContext,Set())
16/09/02 15:56:26 DEBUG ClosureCleaner:      (class org.apache.spark.SparkContext$$anonfun$hadoopFile$1,Set(path$6))
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outermost object is not a closure, so do not clone it: (class org.apache.spark.SparkContext,org.apache.spark.SparkContext@3d3a1232)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + cloning the object <function0> of class org.apache.spark.SparkContext$$anonfun$hadoopFile$1
16/09/02 15:56:26 DEBUG ClosureCleaner:  + cleaning cloned closure <function0> recursively (org.apache.spark.SparkContext$$anonfun$hadoopFile$1)
16/09/02 15:56:26 DEBUG ClosureCleaner: +++ Cleaning closure <function0> (org.apache.spark.SparkContext$$anonfun$hadoopFile$1}) +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared fields: 7
16/09/02 15:56:26 DEBUG ClosureCleaner:      public static final long org.apache.spark.SparkContext$$anonfun$hadoopFile$1.serialVersionUID
16/09/02 15:56:26 DEBUG ClosureCleaner:      private final org.apache.spark.SparkContext org.apache.spark.SparkContext$$anonfun$hadoopFile$1.$outer
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.String org.apache.spark.SparkContext$$anonfun$hadoopFile$1.path$6
16/09/02 15:56:26 DEBUG ClosureCleaner:      private final java.lang.Class org.apache.spark.SparkContext$$anonfun$hadoopFile$1.inputFormatClass$1
16/09/02 15:56:26 DEBUG ClosureCleaner:      private final java.lang.Class org.apache.spark.SparkContext$$anonfun$hadoopFile$1.keyClass$1
16/09/02 15:56:26 DEBUG ClosureCleaner:      private final java.lang.Class org.apache.spark.SparkContext$$anonfun$hadoopFile$1.valueClass$1
16/09/02 15:56:26 DEBUG ClosureCleaner:      private final int org.apache.spark.SparkContext$$anonfun$hadoopFile$1.minPartitions$3
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared methods: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.SparkContext$$anonfun$hadoopFile$1.apply()
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final org.apache.spark.rdd.HadoopRDD org.apache.spark.SparkContext$$anonfun$hadoopFile$1.apply()
16/09/02 15:56:26 DEBUG ClosureCleaner:  + inner classes: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$32
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer classes: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      org.apache.spark.SparkContext
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer objects: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      org.apache.spark.SparkContext@3d3a1232
16/09/02 15:56:26 DEBUG ClosureCleaner:  + fields accessed by starting closure: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      (class org.apache.spark.SparkContext,Set())
16/09/02 15:56:26 DEBUG ClosureCleaner:      (class org.apache.spark.SparkContext$$anonfun$hadoopFile$1,Set(path$6))
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outermost object is not a closure, so do not clone it: (class org.apache.spark.SparkContext,org.apache.spark.SparkContext@3d3a1232)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + the starting closure doesn't actually need org.apache.spark.SparkContext@3d3a1232, so we null it out
16/09/02 15:56:26 DEBUG ClosureCleaner:  +++ closure <function0> (org.apache.spark.SparkContext$$anonfun$hadoopFile$1) is now cleaned +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  +++ closure <function1> (org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$32) is now cleaned +++
16/09/02 15:56:26 DEBUG ClosureCleaner: +++ Cleaning closure <function1> (org.apache.spark.SparkContext$$anonfun$textFile$1$$anonfun$apply$9}) +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared fields: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      public static final long org.apache.spark.SparkContext$$anonfun$textFile$1$$anonfun$apply$9.serialVersionUID
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared methods: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.SparkContext$$anonfun$textFile$1$$anonfun$apply$9.apply(java.lang.Object)
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.String org.apache.spark.SparkContext$$anonfun$textFile$1$$anonfun$apply$9.apply(scala.Tuple2)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + inner classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer objects: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + populating accessed fields because this is the starting closure
16/09/02 15:56:26 DEBUG ClosureCleaner:  + fields accessed by starting closure: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + there are no enclosing objects!
16/09/02 15:56:26 DEBUG ClosureCleaner:  +++ closure <function1> (org.apache.spark.SparkContext$$anonfun$textFile$1$$anonfun$apply$9) is now cleaned +++
16/09/02 15:56:26 DEBUG ClosureCleaner: +++ Cleaning closure <function1> (org.apache.spark.examples.ScalaWordCount$$anonfun$2}) +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared fields: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      public static final long org.apache.spark.examples.ScalaWordCount$$anonfun$2.serialVersionUID
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared methods: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.examples.ScalaWordCount$$anonfun$2.apply(java.lang.Object)
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final scala.collection.mutable.ArrayOps org.apache.spark.examples.ScalaWordCount$$anonfun$2.apply(java.lang.String)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + inner classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer objects: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + populating accessed fields because this is the starting closure
16/09/02 15:56:26 DEBUG ClosureCleaner:  + fields accessed by starting closure: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + there are no enclosing objects!
16/09/02 15:56:26 DEBUG ClosureCleaner:  +++ closure <function1> (org.apache.spark.examples.ScalaWordCount$$anonfun$2) is now cleaned +++
16/09/02 15:56:26 DEBUG ClosureCleaner: +++ Cleaning closure <function1> (org.apache.spark.examples.ScalaWordCount$$anonfun$3}) +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared fields: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      public static final long org.apache.spark.examples.ScalaWordCount$$anonfun$3.serialVersionUID
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared methods: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.examples.ScalaWordCount$$anonfun$3.apply(java.lang.Object)
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final scala.Tuple2 org.apache.spark.examples.ScalaWordCount$$anonfun$3.apply(java.lang.String)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + inner classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer objects: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + populating accessed fields because this is the starting closure
16/09/02 15:56:26 DEBUG ClosureCleaner:  + fields accessed by starting closure: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + there are no enclosing objects!
16/09/02 15:56:26 DEBUG ClosureCleaner:  +++ closure <function1> (org.apache.spark.examples.ScalaWordCount$$anonfun$3) is now cleaned +++
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
-dep MapPartitionsRDD[2] at flatMap at ScalaWordCount.scala:49,cur MapPartitionsRDD[3] at map at ScalaWordCount.scala:50
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
-dep MapPartitionsRDD[1] at textFile at ScalaWordCount.scala:41,cur MapPartitionsRDD[2] at flatMap at ScalaWordCount.scala:49
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
-dep ../spark-1.4.1/examples/src/main/resources/CHANGES.txt HadoopRDD[0] at textFile at ScalaWordCount.scala:41,cur MapPartitionsRDD[1] at textFile at ScalaWordCount.scala:41
16/09/02 15:56:26 DEBUG BlockManager: Getting local block broadcast_0
16/09/02 15:56:26 DEBUG BlockManager: Level for block broadcast_0 is StorageLevel(true, true, false, true, 1)
16/09/02 15:56:26 INFO BlockManager: Getting block broadcast_0 from memory
16/09/02 15:56:26 DEBUG HadoopRDD: Creating new JobConf and caching it for later re-use
16/09/02 15:56:26 DEBUG FileInputFormat: Time taken to get FileStatuses: 36
16/09/02 15:56:26 INFO FileInputFormat: Total input paths to process : 1
16/09/02 15:56:26 DEBUG FileInputFormat: Total # of splits generated by getSplits: 2, TimeTaken: 49
-rdd MapPartitionsRDD[3] at map at ScalaWordCount.scala:50,num partitions 2
16/09/02 15:56:26 DEBUG ClosureCleaner: +++ Cleaning closure <function1> (org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$1$$anonfun$apply$14}) +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared fields: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      public static final long org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$1$$anonfun$apply$14.serialVersionUID
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared methods: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$1$$anonfun$apply$14.apply(java.lang.Object)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + inner classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer objects: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + populating accessed fields because this is the starting closure
16/09/02 15:56:26 DEBUG ClosureCleaner:  + fields accessed by starting closure: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + there are no enclosing objects!
16/09/02 15:56:26 DEBUG ClosureCleaner:  +++ closure <function1> (org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$1$$anonfun$apply$14) is now cleaned +++
16/09/02 15:56:26 DEBUG ClosureCleaner: +++ Cleaning closure <function2> (org.apache.spark.examples.ScalaWordCount$$anonfun$1}) +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared fields: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      public static final long org.apache.spark.examples.ScalaWordCount$$anonfun$1.serialVersionUID
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared methods: 3
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final int org.apache.spark.examples.ScalaWordCount$$anonfun$1.apply(int,int)
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.examples.ScalaWordCount$$anonfun$1.apply(java.lang.Object,java.lang.Object)
16/09/02 15:56:26 DEBUG ClosureCleaner:      public int org.apache.spark.examples.ScalaWordCount$$anonfun$1.apply$mcIII$sp(int,int)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + inner classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer objects: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + populating accessed fields because this is the starting closure
16/09/02 15:56:26 DEBUG ClosureCleaner:  + fields accessed by starting closure: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + there are no enclosing objects!
16/09/02 15:56:26 DEBUG ClosureCleaner:  +++ closure <function2> (org.apache.spark.examples.ScalaWordCount$$anonfun$1) is now cleaned +++
16/09/02 15:56:26 DEBUG ClosureCleaner: +++ Cleaning closure <function2> (org.apache.spark.examples.ScalaWordCount$$anonfun$1}) +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared fields: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      public static final long org.apache.spark.examples.ScalaWordCount$$anonfun$1.serialVersionUID
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared methods: 3
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final int org.apache.spark.examples.ScalaWordCount$$anonfun$1.apply(int,int)
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.examples.ScalaWordCount$$anonfun$1.apply(java.lang.Object,java.lang.Object)
16/09/02 15:56:26 DEBUG ClosureCleaner:      public int org.apache.spark.examples.ScalaWordCount$$anonfun$1.apply$mcIII$sp(int,int)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + inner classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer objects: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + populating accessed fields because this is the starting closure
16/09/02 15:56:26 DEBUG ClosureCleaner:  + fields accessed by starting closure: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + there are no enclosing objects!
16/09/02 15:56:26 DEBUG ClosureCleaner:  +++ closure <function2> (org.apache.spark.examples.ScalaWordCount$$anonfun$1) is now cleaned +++
16/09/02 15:56:26 INFO PairRDDFunctions: self MapPartitionsRDD[3] at map at ScalaWordCount.scala:50,self.partitioner None,new one org.apache.spark.HashPartitioner@2
16/09/02 15:56:26 DEBUG ClosureCleaner: +++ Cleaning closure <function1> (org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12}) +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared fields: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public static final long org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.serialVersionUID
16/09/02 15:56:26 DEBUG ClosureCleaner:      private final org.apache.spark.rdd.RDD$$anonfun$collect$1 org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.$outer
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared methods: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(java.lang.Object)
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(scala.collection.Iterator)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + inner classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer classes: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      org.apache.spark.rdd.RDD$$anonfun$collect$1
16/09/02 15:56:26 DEBUG ClosureCleaner:      org.apache.spark.rdd.RDD
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer objects: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      <function0>
16/09/02 15:56:26 DEBUG ClosureCleaner:      ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:26 DEBUG ClosureCleaner:  + populating accessed fields because this is the starting closure
16/09/02 15:56:26 DEBUG ClosureCleaner:  + fields accessed by starting closure: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      (class org.apache.spark.rdd.RDD,Set(org$apache$spark$rdd$RDD$$evidence$1))
16/09/02 15:56:26 DEBUG ClosureCleaner:      (class org.apache.spark.rdd.RDD$$anonfun$collect$1,Set($outer))
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outermost object is not a closure, so do not clone it: (class org.apache.spark.rdd.RDD,ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + cloning the object <function0> of class org.apache.spark.rdd.RDD$$anonfun$collect$1
16/09/02 15:56:26 DEBUG ClosureCleaner:  + cleaning cloned closure <function0> recursively (org.apache.spark.rdd.RDD$$anonfun$collect$1)
16/09/02 15:56:26 DEBUG ClosureCleaner: +++ Cleaning closure <function0> (org.apache.spark.rdd.RDD$$anonfun$collect$1}) +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared fields: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public static final long org.apache.spark.rdd.RDD$$anonfun$collect$1.serialVersionUID
16/09/02 15:56:26 DEBUG ClosureCleaner:      private final org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD$$anonfun$collect$1.$outer
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared methods: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.rdd.RDD$$anonfun$collect$1.apply()
16/09/02 15:56:26 DEBUG ClosureCleaner:      public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD$$anonfun$collect$1.org$apache$spark$rdd$RDD$$anonfun$$$outer()
16/09/02 15:56:26 DEBUG ClosureCleaner:  + inner classes: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer classes: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      org.apache.spark.rdd.RDD
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer objects: 1
16/09/02 15:56:26 DEBUG ClosureCleaner:      ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:26 DEBUG ClosureCleaner:  + fields accessed by starting closure: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      (class org.apache.spark.rdd.RDD,Set(org$apache$spark$rdd$RDD$$evidence$1))
16/09/02 15:56:26 DEBUG ClosureCleaner:      (class org.apache.spark.rdd.RDD$$anonfun$collect$1,Set($outer))
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outermost object is not a closure, so do not clone it: (class org.apache.spark.rdd.RDD,ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52)
16/09/02 15:56:26 DEBUG ClosureCleaner:  +++ closure <function0> (org.apache.spark.rdd.RDD$$anonfun$collect$1) is now cleaned +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  +++ closure <function1> (org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12) is now cleaned +++
16/09/02 15:56:26 DEBUG ClosureCleaner: +++ Cleaning closure <function2> (org.apache.spark.SparkContext$$anonfun$runJob$5}) +++
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared fields: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public static final long org.apache.spark.SparkContext$$anonfun$runJob$5.serialVersionUID
16/09/02 15:56:26 DEBUG ClosureCleaner:      private final scala.Function1 org.apache.spark.SparkContext$$anonfun$runJob$5.cleanedFunc$1
16/09/02 15:56:26 DEBUG ClosureCleaner:  + declared methods: 2
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.SparkContext$$anonfun$runJob$5.apply(java.lang.Object,java.lang.Object)
16/09/02 15:56:26 DEBUG ClosureCleaner:      public final java.lang.Object org.apache.spark.SparkContext$$anonfun$runJob$5.apply(org.apache.spark.TaskContext,scala.collection.Iterator)
16/09/02 15:56:26 DEBUG ClosureCleaner:  + inner classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer classes: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + outer objects: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + populating accessed fields because this is the starting closure
16/09/02 15:56:26 DEBUG ClosureCleaner:  + fields accessed by starting closure: 0
16/09/02 15:56:26 DEBUG ClosureCleaner:  + there are no enclosing objects!
16/09/02 15:56:26 DEBUG ClosureCleaner:  +++ closure <function2> (org.apache.spark.SparkContext$$anonfun$runJob$5) is now cleaned +++
16/09/02 15:56:26 INFO SparkContext: Starting job: collect at ScalaWordCount.scala:63,rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetStorageStatus,true) from Actor[akka://sparkDriver/temp/$c]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetStorageStatus,true)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (8.975 ms) AkkaMessage(GetStorageStatus,true) from Actor[akka://sparkDriver/temp/$c]
16/09/02 15:56:26 INFO ShuffledRDD: -checkpoingrdd None
-dep MapPartitionsRDD[3] at map at ScalaWordCount.scala:50,cur ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:26 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetStorageStatus,true) from Actor[akka://sparkDriver/temp/$d]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetStorageStatus,true)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.725 ms) AkkaMessage(GetStorageStatus,true) from Actor[akka://sparkDriver/temp/$d]
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetStorageStatus,true) from Actor[akka://sparkDriver/temp/$e]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetStorageStatus,true)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.608 ms) AkkaMessage(GetStorageStatus,true) from Actor[akka://sparkDriver/temp/$e]
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetStorageStatus,true) from Actor[akka://sparkDriver/temp/$f]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetStorageStatus,true)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.617 ms) AkkaMessage(GetStorageStatus,true) from Actor[akka://sparkDriver/temp/$f]
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetStorageStatus,true) from Actor[akka://sparkDriver/temp/$g]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetStorageStatus,true)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.589 ms) AkkaMessage(GetStorageStatus,true) from Actor[akka://sparkDriver/temp/$g]
16/09/02 15:56:26 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO SparkContext: RDD's recursive dependencies:
(2) ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52 []
 +-(2) MapPartitionsRDD[3] at map at ScalaWordCount.scala:50 []
    |  MapPartitionsRDD[2] at flatMap at ScalaWordCount.scala:49 []
    |  MapPartitionsRDD[1] at textFile at ScalaWordCount.scala:41 []
    |  ../spark-1.4.1/examples/src/main/resources/CHANGES.txt HadoopRDD[0] at textFile at ScalaWordCount.scala:41 []
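the tree printed above is the RDD lineage, the same thing rdd.toDebugString gives you: a HadoopRDD from textFile, two MapPartitionsRDDs from flatMap and map, then a ShuffledRDD from reduceByKey, with the +- indent marking the shuffle boundary. roughly, the pipeline that produced it looks like this (a sketch of the word count in spark-shell, the real ScalaWordCount may differ in details):

val lines  = sc.textFile("../spark-1.4.1/examples/src/main/resources/CHANGES.txt") // HadoopRDD[0] -> MapPartitionsRDD[1]
val words  = lines.flatMap(_.split("\\s+"))   // MapPartitionsRDD[2]
val pairs  = words.map(w => (w, 1))           // MapPartitionsRDD[3]
val counts = pairs.reduceByKey(_ + _)         // ShuffledRDD[4], a new stage hides behind it
println(counts.toDebugString)                 // prints a lineage tree like the one logged above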
16/09/02 15:56:26 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO DAGScheduler: Registering RDD 3 (map at ScalaWordCount.scala:50)
16/09/02 15:56:26 INFO DAGScheduler: -no mapstage exists,jobid 0,shuffleid 0
16/09/02 15:56:26 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO DAGScheduler: -mapstage exists,jobid 0,shuffleid 0
16/09/02 15:56:26 INFO DAGScheduler: Got job 0 (collect at ScalaWordCount.scala:63) with 2 output partitions (allowLocal=false)
16/09/02 15:56:26 INFO DAGScheduler: Final stage: ResultStage 1(collect at ScalaWordCount.scala:63),rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:26 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
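so the ShuffleDependency introduced by reduceByKey cuts the job in two: everything upstream of the shuffle (textFile -> flatMap -> map) is registered as ShuffleMapStage 0, while the collect side becomes the final ResultStage 1, which cannot run until stage 0 has written its map outputs.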
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@656b81ac),true) from Actor[akka://sparkDriver/temp/$h]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@656b81ac),true)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.279 ms) AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@656b81ac),true) from Actor[akka://sparkDriver/temp/$h]
16/09/02 15:56:26 INFO DAGScheduler: -cachelocs contains nil,rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:26 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO DAGScheduler: --dep org.apache.spark.ShuffleDependency@6f3348c1 of stage ResultStage 1
16/09/02 15:56:26 INFO DAGScheduler: -mapstage exists,jobid 0,shuffleid 0
16/09/02 15:56:26 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
16/09/02 15:56:26 INFO DAGScheduler: *submitStage(ResultStage 1)
16/09/02 15:56:26 INFO DAGScheduler: -cachelocs contains nil,rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:26 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO DAGScheduler: --dep org.apache.spark.ShuffleDependency@6f3348c1 of stage ResultStage 1
16/09/02 15:56:26 INFO DAGScheduler: -mapstage exists,jobid 0,shuffleid 0
16/09/02 15:56:26 INFO DAGScheduler: -*missing: List(ShuffleMapStage 0)
16/09/02 15:56:26 INFO DAGScheduler: *submitStage(ShuffleMapStage 0)
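the *submitStage lines show the scheduler recursing: ResultStage 1 is tried first, found to have a missing parent, parked in the waiting set, and the same call is made on ShuffleMapStage 0, which has no missing parents and gets its tasks submitted. a self-contained sketch of that loop (illustrative only, the real DAGScheduler.submitStage is of course far more involved):

import scala.collection.mutable

case class Stage(id: Int, parents: List[Stage], var done: Boolean = false)

val waiting = mutable.Set.empty[Stage]

def submitStage(stage: Stage): Unit = {
  val missing = stage.parents.filterNot(_.done)
  if (missing.isEmpty)
    println(s"submitMissingTasks(Stage ${stage.id})")  // runnable right away
  else {
    waiting += stage           // park it until the parents finish
    missing.foreach(submitStage)  // recurse into the parents
  }
}

val mapStage    = Stage(0, Nil)
val resultStage = Stage(1, List(mapStage))
submitStage(resultStage)  // prints submitMissingTasks(Stage 0); resultStage waits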
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@441cabfd),true) from Actor[akka://sparkDriver/temp/$i]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@441cabfd),true)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (2.331 ms) AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@441cabfd),true) from Actor[akka://sparkDriver/temp/$i]
16/09/02 15:56:26 INFO DAGScheduler: -cachelocs contains nil,rdd MapPartitionsRDD[3] at map at ScalaWordCount.scala:50
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO DAGScheduler: --dep org.apache.spark.OneToOneDependency@14c54b12 of stage ShuffleMapStage 0
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@7b083f3b),true) from Actor[akka://sparkDriver/temp/$j]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@7b083f3b),true)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (2.103 ms) AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@7b083f3b),true) from Actor[akka://sparkDriver/temp/$j]
16/09/02 15:56:26 INFO DAGScheduler: -cachelocs contains nil,rdd MapPartitionsRDD[2] at flatMap at ScalaWordCount.scala:49
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO DAGScheduler: --dep org.apache.spark.OneToOneDependency@260d739f of stage ShuffleMapStage 0
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@367f571d),true) from Actor[akka://sparkDriver/temp/$k]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@367f571d),true)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.776 ms) AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@367f571d),true) from Actor[akka://sparkDriver/temp/$k]
16/09/02 15:56:26 INFO DAGScheduler: -cachelocs contains nil,rdd MapPartitionsRDD[1] at textFile at ScalaWordCount.scala:41
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO DAGScheduler: --dep org.apache.spark.OneToOneDependency@d36489f of stage ShuffleMapStage 0
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@26deb73b),true) from Actor[akka://sparkDriver/temp/$l]
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@26deb73b),true)
16/09/02 15:56:26 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.632 ms) AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@26deb73b),true) from Actor[akka://sparkDriver/temp/$l]
16/09/02 15:56:26 INFO DAGScheduler: -cachelocs contains nil,rdd ../spark-1.4.1/examples/src/main/resources/CHANGES.txt HadoopRDD[0] at textFile at ScalaWordCount.scala:41
16/09/02 15:56:26 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO DAGScheduler: -*missing: List()
16/09/02 15:56:26 INFO DAGScheduler: --*Submitting ShuffleMapStage 0 (MapPartitionsRDD[3] at map at ScalaWordCount.scala:50), which has no missing parents
16/09/02 15:56:26 DEBUG DAGScheduler: submitMissingTasks(ShuffleMapStage 0)
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:26 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:27 INFO MemoryStore: ensureFreeSpace(4200) called with curMem=158192, maxMem=278302556
16/09/02 15:56:27 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.1 KB, free 265.3 MB)
16/09/02 15:56:27 DEBUG BlockManager: Put block broadcast_1 locally took  2 ms
16/09/02 15:56:27 DEBUG BlockManager: Putting block broadcast_1 without replication took  3 ms
16/09/02 15:56:27 INFO MemoryStore: ensureFreeSpace(2337) called with curMem=162392, maxMem=278302556
16/09/02 15:56:27 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.3 KB, free 265.3 MB)
16/09/02 15:56:27 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(UpdateBlockInfo(BlockManagerId(driver, 192.168.1.136, 53720),broadcast_1_piece0,StorageLevel(false, true, false, false, 1),2337,0,0),true) from Actor[akka://sparkDriver/temp/$m]
16/09/02 15:56:27 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(UpdateBlockInfo(BlockManagerId(driver, 192.168.1.136, 53720),broadcast_1_piece0,StorageLevel(false, true, false, false, 1),2337,0,0),true)
16/09/02 15:56:27 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.1.136:53720 (size: 2.3 KB, free: 265.4 MB)
16/09/02 15:56:27 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (6.775 ms) AkkaMessage(UpdateBlockInfo(BlockManagerId(driver, 192.168.1.136, 53720),broadcast_1_piece0,StorageLevel(false, true, false, false, 1),2337,0,0),true) from Actor[akka://sparkDriver/temp/$m]
16/09/02 15:56:27 DEBUG BlockManagerMaster: Updated info of block broadcast_1_piece0
16/09/02 15:56:27 DEBUG BlockManager: Told master about block broadcast_1_piece0
16/09/02 15:56:27 DEBUG BlockManager: Put block broadcast_1_piece0 locally took  9 ms
16/09/02 15:56:27 DEBUG BlockManager: Putting block broadcast_1_piece0 without replication took  11 ms
16/09/02 15:56:27 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:886
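note that broadcast_1 is not user data: the broadcast at DAGScheduler.scala:886 above is the DAGScheduler serializing the stage's RDD plus its shuffle dependency once and broadcasting that "task binary", instead of shipping a full copy inside every task. user code can ride the same mechanism; a minimal sketch (the lookup table is made up, not from the example):

val lookup = sc.broadcast(Map("spark" -> 1, "hadoop" -> 2))  // shipped once per executor, stored as broadcast_N
val scored = sc.parallelize(Seq("spark", "flink"))
  .map(w => (w, lookup.value.getOrElse(w, 0)))               // executors read their local copy
println(scored.collect().mkString(", "))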
16/09/02 15:56:27 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:27 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:27 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:27 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:27 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:27 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:27 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:27 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:27 INFO DAGScheduler: Submitting 2 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[3] at map at ScalaWordCount.scala:50)
16/09/02 15:56:27 DEBUG DAGScheduler: New pending tasks: Set(ShuffleMapTask(0, 1), ShuffleMapTask(0, 0))
16/09/02 15:56:27 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
16/09/02 15:56:27 DEBUG TaskSetManager: Epoch for TaskSet 0.0: 0
16/09/02 15:56:27 INFO TaskSetManager: Valid locality levels for TaskSet 0.0: NO_PREF, ANY
16/09/02 15:56:27 INFO TaskSchedulerImpl: -is local:false,hasReceivedTask:false
16/09/02 15:56:27 INFO TaskSchedulerImpl: -finished
16/09/02 15:56:27 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:27 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(ReviveOffers,false)
16/09/02 15:56:27 INFO DAGScheduler: -Checking for newly runnable parent stages
16/09/02 15:56:27 INFO DAGScheduler: -running: Set(ShuffleMapStage 0)
16/09/02 15:56:27 INFO DAGScheduler: -waiting: Set(ResultStage 1)
16/09/02 15:56:27 INFO DAGScheduler: -failed: Set()
16/09/02 15:56:27 INFO TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
16/09/02 15:56:27 INFO DAGScheduler: --stage to submit:ResultStage 1
16/09/02 15:56:27 INFO DAGScheduler: *submitStage(ResultStage 1)
16/09/02 15:56:27 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:27 INFO DAGScheduler: -cachelocs contains nil,rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:27 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:27 INFO DAGScheduler: --dep org.apache.spark.ShuffleDependency@6f3348c1 of stage ResultStage 1
16/09/02 15:56:27 INFO DAGScheduler: -mapstage exists,jobid 0,shuffleid 0
16/09/02 15:56:27 INFO DAGScheduler: -*missing: List(ShuffleMapStage 0)
16/09/02 15:56:27 INFO DAGScheduler: *submitStage(ShuffleMapStage 0)
16/09/02 15:56:27 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:27 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (20.188 ms) AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:27 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:27 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(ReviveOffers,false)
16/09/02 15:56:27 INFO TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
16/09/02 15:56:27 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:27 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:27 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.864 ms) AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(ReviveOffers,false)
16/09/02 15:56:28 INFO TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
16/09/02 15:56:28 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:28 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.708 ms) AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
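notice that nothing has launched yet: every ReviveOffers round ends with "launchedTask false" at both locality levels because no executor has registered so far -- the worker is still forking the executor jvm. how long a TaskSetManager waits at each locality level before falling back is tunable, e.g. (a sketch, these are the standard config keys with their defaults):

val conf = new org.apache.spark.SparkConf()
  .set("spark.locality.wait", "3s")          // base wait before dropping a locality level
  .set("spark.locality.wait.process", "3s")  // override for PROCESS_LOCAL
  .set("spark.locality.wait.node", "3s")     // override for NODE_LOCAL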
16/09/02 15:56:28 DEBUG AppClient$ClientActor: Received unexpected actor system event: Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.02 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.019 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.023 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.034 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.023 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.019 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(RetrieveSparkProps,true) from Actor[akka.tcp://driverPropsFetcher@192.168.1.136:53721/temp/$b]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(RetrieveSparkProps,true)
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (3.792 ms) AkkaMessage(RetrieveSparkProps,true) from Actor[akka.tcp://driverPropsFetcher@192.168.1.136:53721/temp/$b]
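the RetrieveSparkProps exchange is the freshly forked executor asking the driver for its spark config through a short-lived actor system called driverPropsFetcher. the AssociationError / Disassociated storm that follows just means that temporary actor system shut itself down once the props were fetched; it looks alarming at DEBUG level but is harmless.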
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.204 ms) AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (6.206 ms) AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AppClient$ClientActor: Received unexpected actor system event: AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.058 ms) AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.108 ms) AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.745 ms) AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.126 ms) AssociationError [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]: Error [Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721] [
akka.remote.ShutDownAssociation: Shut down address: akka.tcp://driverPropsFetcher@192.168.1.136:53721
Caused by: akka.remote.transport.Transport$InvalidAssociationException: The remote system terminated the association because it is shutting down.
] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.044 ms) Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.038 ms) Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.033 ms) Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.058 ms) Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AppClient$ClientActor: Received unexpected actor system event: Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.057 ms) Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.66 ms) Disassociated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://driverPropsFetcher@192.168.1.136:53721] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.021 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.019 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.021 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AppClient$ClientActor: Received unexpected actor system event: Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.03 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.02 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:28 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.056 ms) Associated [akka.tcp://sparkDriver@192.168.1.136:53712] <- [akka.tcp://sparkExecutor@192.168.1.136:53723] from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(ReviveOffers,false)
16/09/02 15:56:29 INFO TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
16/09/02 15:56:29 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:29 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.735 ms) AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:29 DEBUG Worker: Received unexpected actor system event: Associated [akka.tcp://sparkWorker1@192.168.1.136:53716] <- [akka.tcp://sparkExecutor@192.168.1.136:53723]
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(RegisterExecutor(0,AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/user/Executor#2125883683]),192.168.1.136:53723,1,Map(stdout -> http://192.168.1.136:53717/logPage/?appId=app-20160902155625-0000&executorId=0&logType=stdout, stderr -> http://192.168.1.136:53717/logPage/?appId=app-20160902155625-0000&executorId=0&logType=stderr)),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$f]
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(RegisterExecutor(0,AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/user/Executor#2125883683]),192.168.1.136:53723,1,Map(stdout -> http://192.168.1.136:53717/logPage/?appId=app-20160902155625-0000&executorId=0&logType=stdout, stderr -> http://192.168.1.136:53717/logPage/?appId=app-20160902155625-0000&executorId=0&logType=stderr)),true)
16/09/02 15:56:29 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/user/Executor#2125883683]) with ID 0
16/09/02 15:56:29 INFO TaskSchedulerImpl: -found new executor to add
16/09/02 15:56:29 INFO TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
16/09/02 15:56:29 INFO TaskSetManager: Valid locality levels for TaskSet 0.0: NO_PREF, ANY
16/09/02 15:56:29 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:29 INFO DAGScheduler: -new executor added,id 0,host 192.168.1.136
16/09/02 15:56:29 INFO DAGScheduler: -Checking for newly runnable parent stages
16/09/02 15:56:29 INFO DAGScheduler: -running: Set(ShuffleMapStage 0)
16/09/02 15:56:29 INFO DAGScheduler: -waiting: Set(ResultStage 1)
16/09/02 15:56:29 INFO DAGScheduler: -failed: Set()
16/09/02 15:56:29 INFO DAGScheduler: --stage to submit:ResultStage 1
16/09/02 15:56:29 INFO DAGScheduler: *submitStage(ResultStage 1)
16/09/02 15:56:29 INFO DAGScheduler: -cachelocs contains nil,rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:29 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:29 INFO DAGScheduler: --dep org.apache.spark.ShuffleDependency@6f3348c1 of stage ResultStage 1
16/09/02 15:56:29 INFO DAGScheduler: -mapstage exists,jobid 0,shuffleid 0
16/09/02 15:56:29 INFO DAGScheduler: -*missing: List(ShuffleMapStage 0)
16/09/02 15:56:29 INFO DAGScheduler: *submitStage(ShuffleMapStage 0)
16/09/02 15:56:29 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.1.136, PROCESS_LOCAL, 1524 bytes)
16/09/02 15:56:29 INFO DAGScheduler: -Checking for newly runnable parent stages
16/09/02 15:56:29 INFO DAGScheduler: -running: Set(ShuffleMapStage 0)
16/09/02 15:56:29 INFO DAGScheduler: -waiting: Set(ResultStage 1)
16/09/02 15:56:29 INFO DAGScheduler: -failed: Set()
16/09/02 15:56:29 INFO DAGScheduler: --stage to submit:ResultStage 1
16/09/02 15:56:29 INFO DAGScheduler: *submitStage(ResultStage 1)
16/09/02 15:56:29 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask true,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:29 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:29 INFO DAGScheduler: -cachelocs contains nil,rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:29 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:29 INFO DAGScheduler: --dep org.apache.spark.ShuffleDependency@6f3348c1 of stage ResultStage 1
16/09/02 15:56:29 INFO DAGScheduler: -mapstage exists,jobid 0,shuffleid 0
16/09/02 15:56:29 INFO DAGScheduler: -*missing: List(ShuffleMapStage 0)
16/09/02 15:56:29 INFO DAGScheduler: *submitStage(ShuffleMapStage 0)
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (52.425 ms) AkkaMessage(RegisterExecutor(0,AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/user/Executor#2125883683]),192.168.1.136:53723,1,Map(stdout -> http://192.168.1.136:53717/logPage/?appId=app-20160902155625-0000&executorId=0&logType=stdout, stderr -> http://192.168.1.136:53717/logPage/?appId=app-20160902155625-0000&executorId=0&logType=stderr)),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$f]
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(RegisterBlockManager(BlockManagerId(0, 192.168.1.136, 53726),278302556,AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/user/BlockManagerEndpoint1#-1075081573])),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$g]
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(RegisterBlockManager(BlockManagerId(0, 192.168.1.136, 53726),278302556,AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/user/BlockManagerEndpoint1#-1075081573])),true)
16/09/02 15:56:29 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.136:53726 with 265.4 MB RAM, BlockManagerId(0, 192.168.1.136, 53726)
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.011 ms) AkkaMessage(RegisterBlockManager(BlockManagerId(0, 192.168.1.136, 53726),278302556,AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/user/BlockManagerEndpoint1#-1075081573])),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$g]
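now the interesting part: a second actor system, sparkExecutor@192.168.1.136:53723, associates with the driver and registers executor 0 with 1 core, and its BlockManager (port 53726) registers with the master endpoint on the driver -- the executor really lives in its own jvm with its own ports. from this moment the scheduler finally has a core to offer, which is why task 0.0 got launched in the middle of the registration above.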
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StatusUpdate(0,0,RUNNING,org.apache.spark.util.SerializableBuffer@19d00552),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StatusUpdate(0,0,RUNNING,org.apache.spark.util.SerializableBuffer@19d00552),false)
16/09/02 15:56:29 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.981 ms) AkkaMessage(StatusUpdate(0,0,RUNNING,org.apache.spark.util.SerializableBuffer@19d00552),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(ReviveOffers,false)
16/09/02 15:56:30 INFO TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 1
16/09/02 15:56:30 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:30 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.155 ms) AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetLocations(broadcast_1_piece0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$i]
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetLocations(broadcast_1_piece0),true)
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.411 ms) AkkaMessage(GetLocations(broadcast_1_piece0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$i]
16/09/02 15:56:30 DEBUG ResourceLeakDetector: -Dio.netty.leakDetectionLevel: simple
16/09/02 15:56:30 DEBUG Recycler: -Dio.netty.recycler.maxCapacity.default: 262144
16/09/02 15:56:30 INFO BlockManager: -getblockdata,shuffle?false,blockid broadcast_1_piece0
16/09/02 15:56:30 DEBUG BlockManager: Level for block broadcast_1_piece0 is StorageLevel(true, true, false, false, 1)
16/09/02 15:56:30 INFO BlockManager: Getting block broadcast_1_piece0 from memory
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(UpdateBlockInfo(BlockManagerId(0, 192.168.1.136, 53726),broadcast_1_piece0,StorageLevel(false, true, false, false, 1),2337,0,0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$j]
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(UpdateBlockInfo(BlockManagerId(0, 192.168.1.136, 53726),broadcast_1_piece0,StorageLevel(false, true, false, false, 1),2337,0,0),true)
16/09/02 15:56:30 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.1.136:53726 (size: 2.3 KB, free: 265.4 MB)
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.164 ms) AkkaMessage(UpdateBlockInfo(BlockManagerId(0, 192.168.1.136, 53726),broadcast_1_piece0,StorageLevel(false, true, false, false, 1),2337,0,0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$j]
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetLocations(broadcast_0_piece0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$k]
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetLocations(broadcast_0_piece0),true)
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.527 ms) AkkaMessage(GetLocations(broadcast_0_piece0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$k]
16/09/02 15:56:30 INFO BlockManager: -getblockdata,shuffle?false,blockid broadcast_0_piece0
16/09/02 15:56:30 DEBUG BlockManager: Level for block broadcast_0_piece0 is StorageLevel(true, true, false, false, 1)
16/09/02 15:56:30 INFO BlockManager: Getting block broadcast_0_piece0 from memory
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(UpdateBlockInfo(BlockManagerId(0, 192.168.1.136, 53726),broadcast_0_piece0,StorageLevel(false, true, false, false, 1),13080,0,0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$l]
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(UpdateBlockInfo(BlockManagerId(0, 192.168.1.136, 53726),broadcast_0_piece0,StorageLevel(false, true, false, false, 1),13080,0,0),true)
16/09/02 15:56:30 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.136:53726 (size: 12.8 KB, free: 265.4 MB)
16/09/02 15:56:30 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.047 ms) AkkaMessage(UpdateBlockInfo(BlockManagerId(0, 192.168.1.136, 53726),broadcast_0_piece0,StorageLevel(false, true, false, false, 1),13080,0,0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$l]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(ReviveOffers,false)
16/09/02 15:56:31 INFO TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 1
16/09/02 15:56:31 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:31 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.854 ms) AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StatusUpdate(0,0,FINISHED,org.apache.spark.util.SerializableBuffer@388351fd),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StatusUpdate(0,0,FINISHED,org.apache.spark.util.SerializableBuffer@388351fd),false)
16/09/02 15:56:31 INFO TaskSetManager: -removing task id 0,parent org.apache.spark.scheduler.Pool@15d43453
16/09/02 15:56:31 INFO TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
16/09/02 15:56:31 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:31 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, 192.168.1.136, PROCESS_LOCAL, 1524 bytes)
16/09/02 15:56:31 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask true,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:31 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:31 INFO DAGScheduler: -Checking for newly runnable parent stages
16/09/02 15:56:31 INFO DAGScheduler: -running: Set(ShuffleMapStage 0)
16/09/02 15:56:31 INFO DAGScheduler: -waiting: Set(ResultStage 1)
16/09/02 15:56:31 INFO DAGScheduler: -failed: Set()
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (3.193 ms) AkkaMessage(StatusUpdate(0,0,FINISHED,org.apache.spark.util.SerializableBuffer@388351fd),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 INFO DAGScheduler: --stage to submit:ResultStage 1
16/09/02 15:56:31 INFO DAGScheduler: *submitStage(ResultStage 1)
16/09/02 15:56:31 INFO DAGScheduler: -cachelocs contains nil,rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:31 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:31 INFO DAGScheduler: --dep org.apache.spark.ShuffleDependency@6f3348c1 of stage ResultStage 1
16/09/02 15:56:31 INFO DAGScheduler: -mapstage exists,jobid 0,shuffleid 0
16/09/02 15:56:31 INFO DAGScheduler: -*missing: List(ShuffleMapStage 0)
16/09/02 15:56:31 INFO DAGScheduler: *submitStage(ShuffleMapStage 0)
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StatusUpdate(0,1,RUNNING,org.apache.spark.util.SerializableBuffer@7cd4be04),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StatusUpdate(0,1,RUNNING,org.apache.spark.util.SerializableBuffer@7cd4be04),false)
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.306 ms) AkkaMessage(StatusUpdate(0,1,RUNNING,org.apache.spark.util.SerializableBuffer@7cd4be04),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 INFO TaskResultGetter: -deserialized result from tid 0
16/09/02 15:56:31 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 2473 ms on 192.168.1.136 (1/2)
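since executor 0 registered with a single core, the two ShuffleMapTasks cannot run in parallel: the StatusUpdate(0,0,FINISHED) frees the core, task 1.0 is launched on it immediately, and only then does the TaskResultGetter deserialize task 0's result and mark it finished (1/2).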
16/09/02 15:56:31 INFO DAGScheduler: -start to handle task completetion event,task:ShuffleMapTask(0, 0),stage:0,tasktype:ShuffleMapTask
16/09/02 15:56:31 INFO DAGScheduler: ShuffleMapTask finished on 0
16/09/02 15:56:31 INFO DAGScheduler: *finished handle task completed event
16/09/02 15:56:31 INFO DAGScheduler: -Checking for newly runnable parent stages
16/09/02 15:56:31 INFO DAGScheduler: -running: Set(ShuffleMapStage 0)
16/09/02 15:56:31 INFO DAGScheduler: -waiting: Set(ResultStage 1)
16/09/02 15:56:31 INFO DAGScheduler: -failed: Set()
16/09/02 15:56:31 INFO DAGScheduler: --stage to submit:ResultStage 1
16/09/02 15:56:31 INFO DAGScheduler: *submitStage(ResultStage 1)
16/09/02 15:56:31 INFO DAGScheduler: -cachelocs contains nil,rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:31 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:31 INFO DAGScheduler: --dep org.apache.spark.ShuffleDependency@6f3348c1 of stage ResultStage 1
16/09/02 15:56:31 INFO DAGScheduler: -mapstage exists,jobid 0,shuffleid 0
16/09/02 15:56:31 INFO DAGScheduler: -*missing: List(ShuffleMapStage 0)
16/09/02 15:56:31 INFO DAGScheduler: *submitStage(ShuffleMapStage 0)
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StatusUpdate(0,1,FINISHED,org.apache.spark.util.SerializableBuffer@7cf4f8b5),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StatusUpdate(0,1,FINISHED,org.apache.spark.util.SerializableBuffer@7cf4f8b5),false)
16/09/02 15:56:31 INFO TaskSetManager: -removing task id 1,parent org.apache.spark.scheduler.Pool@15d43453
16/09/02 15:56:31 INFO TaskSchedulerImpl: parentName: , name: TaskSet_0, runningTasks: 0
16/09/02 15:56:31 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:31 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@5f767836
16/09/02 15:56:31 INFO TaskResultGetter: -deserialized result from tid 1
16/09/02 15:56:31 DEBUG TaskSetManager: No tasks for locality level NO_PREF, so moving to locality level ANY
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (5.768 ms) AkkaMessage(StatusUpdate(0,1,FINISHED,org.apache.spark.util.SerializableBuffer@7cf4f8b5),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 186 ms on 192.168.1.136 (2/2)
16/09/02 15:56:31 INFO DAGScheduler: -start to handle task completetion event,task:ShuffleMapTask(0, 1),stage:0,tasktype:ShuffleMapTask
16/09/02 15:56:31 INFO DAGScheduler: ShuffleMapTask finished on 0
16/09/02 15:56:31 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
16/09/02 15:56:31 INFO DAGScheduler: ShuffleMapStage 0 (map at ScalaWordCount.scala:50) finished in 4.748 s
16/09/02 15:56:31 INFO DAGScheduler: looking for newly runnable stages
16/09/02 15:56:31 INFO DAGScheduler: running: Set()
16/09/02 15:56:31 INFO DAGScheduler: waiting: Set(ResultStage 1)
16/09/02 15:56:31 INFO DAGScheduler: failed: Set()
16/09/02 15:56:31 DEBUG MapOutputTrackerMaster: Increasing epoch to 1
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@6a86c656),true) from Actor[akka://sparkDriver/temp/$n]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@6a86c656),true)
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.533 ms) AkkaMessage(GetLocationsMultipleBlockIds([Lorg.apache.spark.storage.BlockId;@6a86c656),true) from Actor[akka://sparkDriver/temp/$n]
16/09/02 15:56:31 INFO DAGScheduler: -cachelocs contains nil,rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:31 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:31 INFO DAGScheduler: --dep org.apache.spark.ShuffleDependency@6f3348c1 of stage ResultStage 1
16/09/02 15:56:31 INFO DAGScheduler: -mapstage exists,jobid 0,shuffleid 0
16/09/02 15:56:31 INFO DAGScheduler: Missing parents for ResultStage 1: List(),rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:31 INFO DAGScheduler: -cachelocs contains nil,rdd ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:31 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:31 INFO DAGScheduler: --dep org.apache.spark.ShuffleDependency@6f3348c1 of stage ResultStage 1
16/09/02 15:56:31 INFO DAGScheduler: -mapstage exists,jobid 0,shuffleid 0
16/09/02 15:56:31 INFO DAGScheduler: -swapped running stages Set(ResultStage 1),waitingStages:Set()
16/09/02 15:56:31 INFO DAGScheduler: Submitting ResultStage 1 (ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52), which is now runnable in jobid 0
16/09/02 15:56:31 DEBUG DAGScheduler: submitMissingTasks(ResultStage 1)
16/09/02 15:56:31 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:31 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:31 INFO MemoryStore: ensureFreeSpace(2304) called with curMem=164729, maxMem=278302556
16/09/02 15:56:31 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 2.3 KB, free 265.3 MB)
16/09/02 15:56:31 DEBUG BlockManager: Put block broadcast_2 locally took  2 ms
16/09/02 15:56:31 DEBUG BlockManager: Putting block broadcast_2 without replication took  2 ms
16/09/02 15:56:31 INFO MemoryStore: ensureFreeSpace(1393) called with curMem=167033, maxMem=278302556
16/09/02 15:56:31 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1393.0 B, free 265.2 MB)
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(UpdateBlockInfo(BlockManagerId(driver, 192.168.1.136, 53720),broadcast_2_piece0,StorageLevel(false, true, false, false, 1),1393,0,0),true) from Actor[akka://sparkDriver/temp/$o]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(UpdateBlockInfo(BlockManagerId(driver, 192.168.1.136, 53720),broadcast_2_piece0,StorageLevel(false, true, false, false, 1),1393,0,0),true)
16/09/02 15:56:31 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.1.136:53720 (size: 1393.0 B, free: 265.4 MB)
16/09/02 15:56:31 DEBUG BlockManagerMaster: Updated info of block broadcast_2_piece0
16/09/02 15:56:31 DEBUG BlockManager: Told master about block broadcast_2_piece0
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.91 ms) AkkaMessage(UpdateBlockInfo(BlockManagerId(driver, 192.168.1.136, 53720),broadcast_2_piece0,StorageLevel(false, true, false, false, 1),1393,0,0),true) from Actor[akka://sparkDriver/temp/$o]
16/09/02 15:56:31 DEBUG BlockManager: Put block broadcast_2_piece0 locally took  2 ms
16/09/02 15:56:31 DEBUG BlockManager: Putting block broadcast_2_piece0 without replication took  2 ms
16/09/02 15:56:31 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:886
16/09/02 15:56:31 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:31 INFO DAGScheduler: -part/rdd:org.apache.spark.rdd.ShuffledRDDPartition@0/ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:31 INFO DAGScheduler: -part/rdd:org.apache.spark.rdd.ShuffledRDDPartition@1/ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:31 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:31 INFO DAGScheduler: -part/rdd:org.apache.spark.rdd.ShuffledRDDPartition@0/ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:31 INFO DAGScheduler: -part/rdd:org.apache.spark.rdd.ShuffledRDDPartition@1/ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52
16/09/02 15:56:31 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 1 (ShuffledRDD[4] at reduceByKey at ScalaWordCount.scala:52)
16/09/02 15:56:31 DEBUG DAGScheduler: New pending tasks: Set(ResultTask(1, 1), ResultTask(1, 0))
16/09/02 15:56:31 INFO TaskSchedulerImpl: Adding task set 1.0 with 2 tasks
16/09/02 15:56:31 DEBUG TaskSetManager: Epoch for TaskSet 1.0: 1
16/09/02 15:56:31 INFO TaskSetManager: Valid locality levels for TaskSet 1.0: NO_PREF, ANY
16/09/02 15:56:31 INFO TaskSchedulerImpl: -is local:false,hasReceivedTask:true
16/09/02 15:56:31 INFO DAGScheduler: *finished handle task completed event
16/09/02 15:56:31 INFO DAGScheduler: -Checking for newly runnable parent stages
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 INFO DAGScheduler: -running: Set(ResultStage 1)
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(ReviveOffers,false)
16/09/02 15:56:31 INFO DAGScheduler: -waiting: Set()
16/09/02 15:56:31 INFO DAGScheduler: -failed: Set()
16/09/02 15:56:31 INFO TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 0
16/09/02 15:56:31 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@8c66f12
16/09/02 15:56:31 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 2, 192.168.1.136, PROCESS_LOCAL, 1243 bytes)
16/09/02 15:56:31 INFO DAGScheduler: -Checking for newly runnable parent stages
16/09/02 15:56:31 INFO DAGScheduler: -running: Set(ResultStage 1)
16/09/02 15:56:31 INFO DAGScheduler: -waiting: Set()
16/09/02 15:56:31 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask true,taskset org.apache.spark.scheduler.TaskSetManager@8c66f12
16/09/02 15:56:31 INFO DAGScheduler: -failed: Set()
16/09/02 15:56:31 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@8c66f12
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (3.364 ms) AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StatusUpdate(0,2,RUNNING,org.apache.spark.util.SerializableBuffer@101ffa12),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StatusUpdate(0,2,RUNNING,org.apache.spark.util.SerializableBuffer@101ffa12),false)
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.418 ms) AkkaMessage(StatusUpdate(0,2,RUNNING,org.apache.spark.util.SerializableBuffer@101ffa12),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetLocations(broadcast_2_piece0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$m]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetLocations(broadcast_2_piece0),true)
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.432 ms) AkkaMessage(GetLocations(broadcast_2_piece0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$m]
16/09/02 15:56:31 INFO BlockManager: -getblockdata,shuffle?false,blockid broadcast_2_piece0
16/09/02 15:56:31 DEBUG BlockManager: Level for block broadcast_2_piece0 is StorageLevel(true, true, false, false, 1)
16/09/02 15:56:31 INFO BlockManager: Getting block broadcast_2_piece0 from memory
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(UpdateBlockInfo(BlockManagerId(0, 192.168.1.136, 53726),broadcast_2_piece0,StorageLevel(false, true, false, false, 1),1393,0,0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$n]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(UpdateBlockInfo(BlockManagerId(0, 192.168.1.136, 53726),broadcast_2_piece0,StorageLevel(false, true, false, false, 1),1393,0,0),true)
16/09/02 15:56:31 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.1.136:53726 (size: 1393.0 B, free: 265.4 MB)
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.897 ms) AkkaMessage(UpdateBlockInfo(BlockManagerId(0, 192.168.1.136, 53726),broadcast_2_piece0,StorageLevel(false, true, false, false, 1),1393,0,0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$n]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(GetMapOutputStatuses(0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$o]
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(GetMapOutputStatuses(0),true)
16/09/02 15:56:31 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 192.168.1.136:53723
16/09/02 15:56:31 INFO MapOutputTrackerMaster: Size of output statuses for shuffle 0 is 151 bytes
16/09/02 15:56:31 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (5.756 ms) AkkaMessage(GetMapOutputStatuses(0),true) from Actor[akka.tcp://sparkExecutor@192.168.1.136:53723/temp/$o]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(ReviveOffers,false)
16/09/02 15:56:32 INFO TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 1
16/09/02 15:56:32 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@8c66f12
16/09/02 15:56:32 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@8c66f12
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.297 ms) AkkaMessage(ReviveOffers,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StatusUpdate(0,2,FINISHED,org.apache.spark.util.SerializableBuffer@2ba0eb66),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StatusUpdate(0,2,FINISHED,org.apache.spark.util.SerializableBuffer@2ba0eb66),false)
16/09/02 15:56:32 INFO TaskSetManager: -removing task id 2,parent org.apache.spark.scheduler.Pool@15d43453
16/09/02 15:56:32 INFO TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 0
16/09/02 15:56:32 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@8c66f12
16/09/02 15:56:32 INFO TaskSetManager: Starting task 1.0 in stage 1.0 (TID 3, 192.168.1.136, PROCESS_LOCAL, 1243 bytes)
16/09/02 15:56:32 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask true,taskset org.apache.spark.scheduler.TaskSetManager@8c66f12
16/09/02 15:56:32 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@8c66f12
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (2.634 ms) AkkaMessage(StatusUpdate(0,2,FINISHED,org.apache.spark.util.SerializableBuffer@2ba0eb66),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:32 INFO DAGScheduler: -Checking for newly runnable parent stages
16/09/02 15:56:32 INFO DAGScheduler: -running: Set(ResultStage 1)
16/09/02 15:56:32 INFO DAGScheduler: -waiting: Set()
16/09/02 15:56:32 INFO DAGScheduler: -failed: Set()
16/09/02 15:56:32 INFO TaskResultGetter: -deserialized result from tid 2
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StatusUpdate(0,3,RUNNING,org.apache.spark.util.SerializableBuffer@a26cd3b),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StatusUpdate(0,3,RUNNING,org.apache.spark.util.SerializableBuffer@a26cd3b),false)
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.449 ms) AkkaMessage(StatusUpdate(0,3,RUNNING,org.apache.spark.util.SerializableBuffer@a26cd3b),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StatusUpdate(0,3,FINISHED,org.apache.spark.util.SerializableBuffer@795b6081),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StatusUpdate(0,3,FINISHED,org.apache.spark.util.SerializableBuffer@795b6081),false)
16/09/02 15:56:32 INFO TaskSetManager: -removing task id 3,parent org.apache.spark.scheduler.Pool@15d43453
16/09/02 15:56:32 INFO TaskSchedulerImpl: parentName: , name: TaskSet_1, runningTasks: 0
16/09/02 15:56:32 INFO TaskSchedulerImpl: -max locality NO_PREF,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@8c66f12
16/09/02 15:56:32 INFO TaskSchedulerImpl: -max locality ANY,launchedTask false,taskset org.apache.spark.scheduler.TaskSetManager@8c66f12
16/09/02 15:56:32 DEBUG TaskSetManager: No tasks for locality level NO_PREF, so moving to locality level ANY
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.592 ms) AkkaMessage(StatusUpdate(0,3,FINISHED,org.apache.spark.util.SerializableBuffer@795b6081),false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:32 INFO TaskResultGetter: -deserialized result from tid 3
16/09/02 15:56:32 INFO DAGScheduler: -start to handle task completetion event,task:ResultTask(1, 0),stage:1,tasktype:ResultTask
16/09/02 15:56:32 INFO DAGScheduler: *finished handle task completed event
16/09/02 15:56:32 INFO DAGScheduler: -Checking for newly runnable parent stages
16/09/02 15:56:32 INFO DAGScheduler: -running: Set(ResultStage 1)
16/09/02 15:56:32 INFO DAGScheduler: -waiting: Set()
16/09/02 15:56:32 INFO DAGScheduler: -failed: Set()
16/09/02 15:56:32 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 2) in 705 ms on 192.168.1.136 (1/2)
16/09/02 15:56:32 INFO DAGScheduler: -start to handle task completetion event,task:ResultTask(1, 1),stage:1,tasktype:ResultTask
16/09/02 15:56:32 INFO TaskSetManager: Finished task 1.0 in stage 1.0 (TID 3) in 304 ms on 192.168.1.136 (2/2)
16/09/02 15:56:32 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
16/09/02 15:56:32 INFO DAGScheduler: ResultStage 1 (collect at ScalaWordCount.scala:63) finished in 0.743 s
16/09/02 15:56:32 INFO DAGScheduler: After removal of stage 1, remaining stages = 1
16/09/02 15:56:32 INFO DAGScheduler: After removal of stage 0, remaining stages = 0
16/09/02 15:56:32 INFO DAGScheduler: =finish this job, org.apache.spark.scheduler.ActiveJob@fedaff0,last task ResultTask(1, 1)
16/09/02 15:56:32 INFO DAGScheduler: *finished handle task completed event
16/09/02 15:56:32 INFO DAGScheduler: -Checking for newly runnable parent stages
16/09/02 15:56:32 INFO DAGScheduler: -running: Set()
16/09/02 15:56:32 INFO DAGScheduler: -waiting: Set()
16/09/02 15:56:32 INFO DAGScheduler: -failed: Set()
16/09/02 15:56:32 INFO DAGScheduler: Job 0 finished: collect at ScalaWordCount.scala:63, took 5.705449 s
16/09/02 15:56:32 INFO ShuffledRDD: -doCheckpointCalled false,checkpointData defined?false
16/09/02 15:56:32 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:32 INFO ShuffledRDD: -checkpoingrdd None
-dep org.apache.spark.ShuffleDependency@6f3348c1,rdd MapPartitionsRDD[3] at map at ScalaWordCount.scala:50,size 1
16/09/02 15:56:32 INFO ShuffledRDD: -checkpoingrdd None
16/09/02 15:56:32 INFO MapPartitionsRDD: -doCheckpointCalled false,checkpointData defined?false
16/09/02 15:56:32 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:32 INFO MapPartitionsRDD: -checkpoingrdd None
-dep org.apache.spark.OneToOneDependency@14c54b12,rdd MapPartitionsRDD[2] at flatMap at ScalaWordCount.scala:49,size 1
16/09/02 15:56:32 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:32 INFO MapPartitionsRDD: -doCheckpointCalled false,checkpointData defined?false
16/09/02 15:56:32 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:32 INFO MapPartitionsRDD: -checkpoingrdd None
-dep org.apache.spark.OneToOneDependency@260d739f,rdd MapPartitionsRDD[1] at textFile at ScalaWordCount.scala:41,size 1
16/09/02 15:56:32 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:32 INFO MapPartitionsRDD: -doCheckpointCalled false,checkpointData defined?false
16/09/02 15:56:32 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:32 INFO MapPartitionsRDD: -checkpoingrdd None
-dep org.apache.spark.OneToOneDependency@d36489f,rdd ../spark-1.4.1/examples/src/main/resources/CHANGES.txt HadoopRDD[0] at textFile at ScalaWordCount.scala:41,size 1
16/09/02 15:56:32 INFO MapPartitionsRDD: -checkpoingrdd None
16/09/02 15:56:32 INFO HadoopRDD: -doCheckpointCalled false,checkpointData defined?false
16/09/02 15:56:32 INFO HadoopRDD: -checkpoingrdd None
16/09/02 15:56:32 INFO HadoopRDD: -checkpoingrdd None
*reduce output to limit 10,found 16403
72df5a3,,1
17:44:08,1
d12c071,,1
7aa269f,,1
github.com/apache/spark/pull/3250,1
github.com/apache/spark/pull/3159,1
github.com/apache/spark/pull/4176,2
16:23:20,1
github.com/apache/spark/pull/6095,1
02:13:06,1
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/09/02 15:56:32 INFO SparkUI: Stopped Spark web UI at http://192.168.1.136:4040
16/09/02 15:56:32 INFO DAGScheduler: Stopping DAGScheduler
16/09/02 15:56:32 INFO SparkDeploySchedulerBackend: Shutting down all executors
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StopExecutors,true) from Actor[akka://sparkDriver/temp/$p]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StopExecutors,true)
16/09/02 15:56:32 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.8 ms) AkkaMessage(StopExecutors,true) from Actor[akka://sparkDriver/temp/$p]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StopDriver,true) from Actor[akka://sparkDriver/temp/$q]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StopDriver,true)
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (0.572 ms) AkkaMessage(StopDriver,true) from Actor[akka://sparkDriver/temp/$q]
16/09/02 15:56:32 DEBUG AppClient$ClientActor: [actor] received message StopAppClient from Actor[akka://sparkDriver/temp/$r]
16/09/02 15:56:32 DEBUG AppClient$ClientActor: [actor] handled message (0.384 ms) StopAppClient from Actor[akka://sparkDriver/temp/$r]
16/09/02 15:56:32 INFO LocalSparkCluster: Shutting down local Spark cluster.
16/09/02 15:56:32 INFO ExecutorRunner: Runner thread for executor app-20160902155625-0000/0 interrupted
16/09/02 15:56:32 INFO ExecutorRunner: Killing process!
16/09/02 15:56:32 DEBUG FileAppender: Closed file /Users/userxx/Cloud/Spark/spark-1.4.1/work/app-20160902155625-0000/0/stdout
16/09/02 15:56:32 DEBUG FileAppender: Closed file /Users/userxx/Cloud/Spark/spark-1.4.1/work/app-20160902155625-0000/0/stderr
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/log,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/applications/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/master/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/driver/kill,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/app/kill,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/logPage/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/history/not-found/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/logPage,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/history/not-found,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/app/json,null}
16/09/02 15:56:32 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/app,null}
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StopMapOutputTracker,true) from Actor[akka://sparkDriver/temp/$s]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StopMapOutputTracker,true)
16/09/02 15:56:32 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (3.629 ms) AkkaMessage(StopMapOutputTracker,true) from Actor[akka://sparkDriver/temp/$s]
16/09/02 15:56:32 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/09/02 15:56:32 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/09/02 15:56:32 INFO Utils: path = /private/var/folders/rt/6f6nq06577vb3c0d8bskm97m0000gn/T/spark-4cad14d1-49d6-4130-a484-2d5c264f67ab/blockmgr-c6228943-b29b-4942-adb7-9a49b227348e, already present as root for deletion.
16/09/02 15:56:32 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/09/02 15:56:32 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/09/02 15:56:32 INFO MemoryStore: MemoryStore cleared
16/09/02 15:56:32 INFO BlockManager: BlockManager stopped
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StopBlockManagerMaster,true) from Actor[akka://sparkDriver/temp/$t]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StopBlockManagerMaster,true)
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (5.406 ms) AkkaMessage(StopBlockManagerMaster,true) from Actor[akka://sparkDriver/temp/$t]
16/09/02 15:56:32 INFO BlockManagerMaster: BlockManagerMaster stopped
16/09/02 15:56:32 INFO SparkContext: Successfully stopped SparkContext
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] received message AkkaMessage(StopCoordinator,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: Received RPC message: AkkaMessage(StopCoordinator,false)
16/09/02 15:56:32 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/09/02 15:56:32 DEBUG AkkaRpcEnv$$anonfun$actorRef$lzycompute$1$1$$anon$1: [actor] handled message (1.117 ms) AkkaMessage(StopCoordinator,false) from Actor[akka://sparkDriver/deadLetters]
16/09/02 15:56:32 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/09/02 15:56:32 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/09/02 15:56:32 INFO Utils: Shutdown hook called
16/09/02 15:56:32 INFO Utils: Deleting directory /private/var/folders/rt/6f6nq06577vb3c0d8bskm97m0000gn/T/spark-4cad14d1-49d6-4130-a484-2d5c264f67ab
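 

  for reference, the word-count pipeline that produced the stages above can be reconstructed from the call sites in the log itself: textFile at ScalaWordCount.scala:41 (the HadoopRDD over examples/src/main/resources/CHANGES.txt), flatMap at :49, map at :50, reduceByKey at :52, and collect at :63. below is a minimal sketch under those assumptions — the tokenizing regex and the limit-10 printing are my guesses from the "*reduce output to limit 10,found 16403" line, so the shipped example may differ in details:

import org.apache.spark.{SparkConf, SparkContext}

object ScalaWordCount {
  def main(args: Array[String]): Unit = {
    // the master is supplied by run-example/spark-submit (see the SparkSubmit cmd above),
    // so only the app name is set here
    val conf = new SparkConf().setAppName("ScalaWordCount")
    val sc = new SparkContext(conf)

    // HadoopRDD[0] / MapPartitionsRDD[1]: textFile at ScalaWordCount.scala:41 in the log
    val lines = sc.textFile("examples/src/main/resources/CHANGES.txt")
    // MapPartitionsRDD[2]: flatMap at :49 (whitespace split is an assumption)
    val words = lines.flatMap(_.split("\\s+"))
    // MapPartitionsRDD[3]: map at :50 -- the last RDD of ShuffleMapStage 0
    val pairs = words.map(w => (w, 1))
    // ShuffledRDD[4]: reduceByKey at :52 -- introduces the ShuffleDependency
    val counts = pairs.reduceByKey(_ + _)

    // collect at :63 triggers ResultStage 1; print only the first 10 of the
    // 16403 distinct tokens, matching the "word,count" lines in the log
    counts.collect().take(10).foreach { case (word, count) => println(s"$word,$count") }

    sc.stop()
  }
}

  the two-stage shape in the log follows directly from this lineage: everything up to map runs as ShuffleMapTasks in stage 0, reduceByKey contributes the ShuffleDependency@6f3348c1 that splits the DAG, and collect drives ResultStage 1. once the job finishes, sc.stop() produces the shutdown tail seen above — the web UI handlers stop, LocalSparkCluster kills the executor runner, and the block manager and remote actor system are torn down.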

 
