
[spark-src-core] 3.2 Run Spark in standalone (client) mode

 

1. Startup command

./bin/spark-submit --class org.apache.spark.examples.JavaWordCount --deploy-mode client --master spark://gzsw-02:7077 lib/spark-examples-1.4.1-hadoop2.4.0.jar hdfs://hd02:/user/hadoop/input.txt

   Notes:

   1) The --master URL is the cluster manager address shown on the Spark master UI page, i.e.

URL: spark://gzsw-02:7077

      and not the 'REST URL xxxx' listed on the same page.

   2) The --deploy-mode flag is optional: client is the default, so omitting it has the same effect as specifying 'client' as above.
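
   For reference, here is a minimal sketch of what the submitted class does. It is abridged from the JavaWordCount example that ships with Spark, so the JavaWordCount.java line numbers in the logs below (:45, :54, :61, :68) refer to the real bundled source, not to this sketch:

import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;

import scala.Tuple2;

public final class JavaWordCount {
  public static void main(String[] args) throws Exception {
    SparkConf conf = new SparkConf().setAppName("JavaWordCount");
    JavaSparkContext ctx = new JavaSparkContext(conf);

    // textFile shows up as "Created broadcast 0 from textFile" in the logs
    JavaRDD<String> lines = ctx.textFile(args[0], 1);

    JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
      public Iterable<String> call(String s) { return Arrays.asList(s.split(" ")); }
    });

    // mapToPair is the boundary of ShuffleMapStage 0 in the logs
    JavaPairRDD<String, Integer> ones = words.mapToPair(new PairFunction<String, String, Integer>() {
      public Tuple2<String, Integer> call(String s) { return new Tuple2<String, Integer>(s, 1); }
    });

    // reduceByKey produces the ShuffledRDD[4] that ResultStage 1 runs over
    JavaPairRDD<String, Integer> counts = ones.reduceByKey(new Function2<Integer, Integer, Integer>() {
      public Integer call(Integer i1, Integer i2) { return i1 + i2; }
    });

    // collect() is the action that actually triggers job 0
    List<Tuple2<String, Integer>> output = counts.collect();
    for (Tuple2<String, Integer> tuple : output) {
      System.out.println(tuple._1() + ": " + tuple._2());
    }
    ctx.stop();
  }
}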

2. Run logs

Spark Command: /usr/local/jdk/jdk1.6.0_31/bin/java -cp /home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/conf/:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-assembly-1.4.1-hadoop2.4.0.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar:/usr/local/hadoop/hadoop-2.5.2/etc/hadoop/ -Xms6g -Xmx6g -XX:MaxPermSize=256m org.apache.spark.deploy.SparkSubmit --master spark://gzsw-02:7077 --deploy-mode client --class org.apache.spark.examples.JavaWordCount lib/spark-examples-1.4.1-hadoop2.4.0.jar hdfs://hd02:/user/hadoop/input.txt
========================================
-executed cmd retruned by Main.java:/usr/local/jdk/jdk1.6.0_31/bin/java -cp /home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/conf/:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-assembly-1.4.1-hadoop2.4.0.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar:/usr/local/hadoop/hadoop-2.5.2/etc/hadoop/ -Xms6g -Xmx6g -XX:MaxPermSize=256m org.apache.spark.deploy.SparkSubmit --master spark://gzsw-02:7077 --deploy-mode client --class org.apache.spark.examples.JavaWordCount lib/spark-examples-1.4.1-hadoop2.4.0.jar hdfs://hd02:/user/hadoop/input.txt
16/09/19 11:28:17 INFO spark.SparkContext: Running Spark version 1.4.1
16/09/19 11:28:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/09/19 11:28:18 INFO spark.SecurityManager: Changing view acls to: hadoop
16/09/19 11:28:18 INFO spark.SecurityManager: Changing modify acls to: hadoop
16/09/19 11:28:18 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
16/09/19 11:28:19 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/09/19 11:28:19 INFO Remoting: Starting remoting
16/09/19 11:28:19 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.100.4:55817]
16/09/19 11:28:19 INFO util.Utils: Successfully started service 'sparkDriver' on port 55817.
16/09/19 11:28:19 INFO spark.SparkEnv: Registering MapOutputTracker
16/09/19 11:28:19 INFO spark.SparkEnv: Registering BlockManagerMaster
16/09/19 11:28:19 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-e19f684c-686b-4841-a863-2e143face3c3/blockmgr-b6ea03ec-ee30-4133-992f-ccf27ef35c93
16/09/19 11:28:19 INFO storage.MemoryStore: MemoryStore started with capacity 2.6 GB
16/09/19 11:28:19 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-e19f684c-686b-4841-a863-2e143face3c3/httpd-1c0a5a85-eef0-4e19-a244-ca7a9c852204
16/09/19 11:28:19 INFO spark.HttpServer: Starting HTTP Server
16/09/19 11:28:19 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/09/19 11:28:19 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:48267
16/09/19 11:28:19 INFO util.Utils: Successfully started service 'HTTP file server' on port 48267.
16/09/19 11:28:19 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/09/19 11:28:19 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/09/19 11:28:19 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:7106
16/09/19 11:28:19 INFO util.Utils: Successfully started service 'SparkUI' on port 7106.
16/09/19 11:28:19 INFO ui.SparkUI: Started SparkUI at http://192.168.100.4:7106
16/09/19 11:28:19 INFO spark.SparkContext: Added JAR file:/home/hadoop/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-examples-1.4.1-hadoop2.4.0.jar at http://192.168.100.4:48267/jars/spark-examples-1.4.1-hadoop2.4.0.jar with timestamp 1474255699921
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster@gzsw-02:7077/user/Master...
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20160919112820-0002
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/0 on worker-20160914175458-192.168.100.15-36198 (192.168.100.15:36198) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/0 on hostPort 192.168.100.15:36198 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/1 on worker-20160914175458-192.168.100.15-36198 (192.168.100.15:36198) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/1 on hostPort 192.168.100.15:36198 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/2 on worker-20160914175457-192.168.100.11-41800 (192.168.100.11:41800) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/2 on hostPort 192.168.100.11:41800 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/3 on worker-20160914175457-192.168.100.11-41800 (192.168.100.11:41800) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/3 on hostPort 192.168.100.11:41800 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/4 on worker-20160914175457-192.168.100.10-46154 (192.168.100.10:46154) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/4 on hostPort 192.168.100.10:46154 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/5 on worker-20160914175457-192.168.100.10-46154 (192.168.100.10:46154) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/5 on hostPort 192.168.100.10:46154 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/6 on worker-20160914175457-192.168.100.6-51383 (192.168.100.6:51383) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/6 on hostPort 192.168.100.6:51383 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/7 on worker-20160914175457-192.168.100.6-51383 (192.168.100.6:51383) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/7 on hostPort 192.168.100.6:51383 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/8 on worker-20160914175457-192.168.100.9-50567 (192.168.100.9:50567) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/8 on hostPort 192.168.100.9:50567 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/9 on worker-20160914175457-192.168.100.9-50567 (192.168.100.9:50567) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/9 on hostPort 192.168.100.9:50567 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/10 on worker-20160914175456-192.168.100.7-36541 (192.168.100.7:36541) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/10 on hostPort 192.168.100.7:36541 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/11 on worker-20160914175456-192.168.100.7-36541 (192.168.100.7:36541) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/11 on hostPort 192.168.100.7:36541 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/12 on worker-20160914175457-192.168.100.8-38650 (192.168.100.8:38650) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/12 on hostPort 192.168.100.8:38650 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/13 on worker-20160914175457-192.168.100.8-38650 (192.168.100.8:38650) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/13 on hostPort 192.168.100.8:38650 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/14 on worker-20160914175457-192.168.100.13-43911 (192.168.100.13:43911) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/14 on hostPort 192.168.100.13:43911 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/15 on worker-20160914175457-192.168.100.13-43911 (192.168.100.13:43911) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/15 on hostPort 192.168.100.13:43911 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/16 on worker-20160914175457-192.168.100.12-44199 (192.168.100.12:44199) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/16 on hostPort 192.168.100.12:44199 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/17 on worker-20160914175457-192.168.100.12-44199 (192.168.100.12:44199) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/17 on hostPort 192.168.100.12:44199 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/18 on worker-20160914175456-192.168.100.14-36693 (192.168.100.14:36693) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/18 on hostPort 192.168.100.14:36693 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor added: app-20160919112820-0002/19 on worker-20160914175456-192.168.100.14-36693 (192.168.100.14:36693) with 2 cores
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20160919112820-0002/19 on hostPort 192.168.100.14:36693 with 2 cores, 2.0 GB RAM
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/0 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/2 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/4 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/1 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/3 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/6 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/5 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/7 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/10 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/8 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/9 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/12 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/11 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/13 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/14 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/16 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/15 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/17 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/18 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/0 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/19 is now LOADING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/1 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/2 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/3 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/4 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/5 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/6 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/7 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/8 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/9 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/10 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/11 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/12 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/13 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/14 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/15 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/16 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/17 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/18 is now RUNNING
16/09/19 11:28:20 INFO client.AppClient$ClientActor: Executor updated: app-20160919112820-0002/19 is now RUNNING
16/09/19 11:28:20 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 38299.
16/09/19 11:28:20 INFO netty.NettyBlockTransferService: Server created on 38299
16/09/19 11:28:20 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/09/19 11:28:20 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.4:38299 with 2.6 GB RAM, BlockManagerId(driver, 192.168.100.4, 38299)
16/09/19 11:28:20 INFO storage.BlockManagerMaster: Registered BlockManager
16/09/19 11:28:20 INFO scheduler.EventLoggingListener: Logging events to file:/home/hadoop/spark/spark-eventlog/app-20160919112820-0002
16/09/19 11:28:20 INFO cluster.SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
16/09/19 11:28:21 INFO storage.MemoryStore: ensureFreeSpace(228680) called with curMem=0, maxMem=2778306969
16/09/19 11:28:21 INFO storage.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 223.3 KB, free 2.6 GB)
16/09/19 11:28:21 INFO storage.MemoryStore: ensureFreeSpace(18130) called with curMem=228680, maxMem=2778306969
16/09/19 11:28:21 INFO storage.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 17.7 KB, free 2.6 GB)
16/09/19 11:28:21 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.100.4:38299 (size: 17.7 KB, free: 2.6 GB)
16/09/19 11:28:21 INFO spark.SparkContext: Created broadcast 0 from textFile at JavaWordCount.java:45
16/09/19 11:28:21 INFO mapred.FileInputFormat: Total input paths to process : 1
16/09/19 11:28:21 INFO spark.SparkContext: Starting job: collect at JavaWordCount.java:68
16/09/19 11:28:21 INFO scheduler.DAGScheduler: Registering RDD 3 (mapToPair at JavaWordCount.java:54)
16/09/19 11:28:21 INFO scheduler.DAGScheduler: Got job 0 (collect at JavaWordCount.java:68) with 1 output partitions (allowLocal=false)
16/09/19 11:28:21 INFO scheduler.DAGScheduler: Final stage: ResultStage 1(collect at JavaWordCount.java:68)
16/09/19 11:28:21 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
16/09/19 11:28:21 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 0)
16/09/19 11:28:21 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[3] at mapToPair at JavaWordCount.java:54), which has no missing parents
16/09/19 11:28:21 INFO storage.MemoryStore: ensureFreeSpace(4760) called with curMem=246810, maxMem=2778306969
16/09/19 11:28:21 INFO storage.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.6 KB, free 2.6 GB)
16/09/19 11:28:21 INFO storage.MemoryStore: ensureFreeSpace(2666) called with curMem=251570, maxMem=2778306969
16/09/19 11:28:21 INFO storage.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.6 KB, free 2.6 GB)
16/09/19 11:28:21 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.100.4:38299 (size: 2.6 KB, free: 2.6 GB)
16/09/19 11:28:21 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:874
16/09/19 11:28:21 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[3] at mapToPair at JavaWordCount.java:54)
16/09/19 11:28:21 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.14:32872/user/Executor#-695323121]) with ID 19
16/09/19 11:28:23 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.100.14, ANY, 1474 bytes)
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.10:34254/user/Executor#680848952]) with ID 4
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.12:45597/user/Executor#61882158]) with ID 17
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.11:55846/user/Executor#-109266911]) with ID 2
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.11:39056/user/Executor#-730178427]) with ID 3
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.6:41904/user/Executor#172197607]) with ID 7
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.15:48188/user/Executor#1126474595]) with ID 1
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.15:53231/user/Executor#643650421]) with ID 0
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.7:44498/user/Executor#1819346495]) with ID 10
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.10:33380/user/Executor#1519517929]) with ID 5
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.13:51226/user/Executor#107130314]) with ID 14
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.14:51287/user/Executor#-786570003]) with ID 18
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.8:42736/user/Executor#-1654106840]) with ID 13
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.8:46813/user/Executor#-1544202525]) with ID 12
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.13:59695/user/Executor#758748544]) with ID 15
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.9:32978/user/Executor#1666271797]) with ID 9
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.9:43395/user/Executor#1432829817]) with ID 8
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.6:55244/user/Executor#-583314465]) with ID 6
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.7:36554/user/Executor#464137168]) with ID 11
16/09/19 11:28:23 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.12:51715/user/Executor#1409392060]) with ID 16
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.14:41531 with 906.2 MB RAM, BlockManagerId(19, 192.168.100.14, 41531)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.10:55997 with 906.2 MB RAM, BlockManagerId(4, 192.168.100.10, 55997)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.12:39237 with 906.2 MB RAM, BlockManagerId(17, 192.168.100.12, 39237)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.11:55993 with 906.2 MB RAM, BlockManagerId(2, 192.168.100.11, 55993)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.10:50592 with 906.2 MB RAM, BlockManagerId(5, 192.168.100.10, 50592)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.6:53362 with 906.2 MB RAM, BlockManagerId(7, 192.168.100.6, 53362)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.11:38102 with 906.2 MB RAM, BlockManagerId(3, 192.168.100.11, 38102)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.7:36601 with 906.2 MB RAM, BlockManagerId(10, 192.168.100.7, 36601)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.14:40729 with 906.2 MB RAM, BlockManagerId(18, 192.168.100.14, 40729)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.8:53863 with 906.2 MB RAM, BlockManagerId(13, 192.168.100.8, 53863)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.13:43366 with 906.2 MB RAM, BlockManagerId(14, 192.168.100.13, 43366)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.9:54016 with 906.2 MB RAM, BlockManagerId(9, 192.168.100.9, 54016)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.6:60357 with 906.2 MB RAM, BlockManagerId(6, 192.168.100.6, 60357)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.9:47909 with 906.2 MB RAM, BlockManagerId(8, 192.168.100.9, 47909)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.15:46824 with 906.2 MB RAM, BlockManagerId(1, 192.168.100.15, 46824)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.15:48343 with 906.2 MB RAM, BlockManagerId(0, 192.168.100.15, 48343)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.8:32827 with 906.2 MB RAM, BlockManagerId(12, 192.168.100.8, 32827)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.13:52959 with 906.2 MB RAM, BlockManagerId(15, 192.168.100.13, 52959)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.7:33396 with 906.2 MB RAM, BlockManagerId(11, 192.168.100.7, 33396)
16/09/19 11:28:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.100.12:59404 with 906.2 MB RAM, BlockManagerId(16, 192.168.100.12, 59404)
16/09/19 11:28:25 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 192.168.100.14:41531 (size: 2.6 KB, free: 906.2 MB)
16/09/19 11:28:25 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.100.14:41531 (size: 17.7 KB, free: 906.2 MB)
16/09/19 11:28:26 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 3398 ms on 192.168.100.14 (1/1)
16/09/19 11:28:26 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
16/09/19 11:28:26 INFO scheduler.DAGScheduler: ShuffleMapStage 0 (mapToPair at JavaWordCount.java:54) finished in 4.501 s
16/09/19 11:28:26 INFO scheduler.DAGScheduler: looking for newly runnable stages
16/09/19 11:28:26 INFO scheduler.DAGScheduler: running: Set()
16/09/19 11:28:26 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 1)
16/09/19 11:28:26 INFO scheduler.DAGScheduler: failed: Set()
16/09/19 11:28:26 INFO scheduler.DAGScheduler: Missing parents for ResultStage 1: List()
16/09/19 11:28:26 INFO scheduler.DAGScheduler: Submitting ResultStage 1 (ShuffledRDD[4] at reduceByKey at JavaWordCount.java:61), which is now runnable
16/09/19 11:28:26 INFO storage.MemoryStore: ensureFreeSpace(2408) called with curMem=254236, maxMem=2778306969
16/09/19 11:28:26 INFO storage.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 2.4 KB, free 2.6 GB)
16/09/19 11:28:26 INFO storage.MemoryStore: ensureFreeSpace(1458) called with curMem=256644, maxMem=2778306969
16/09/19 11:28:26 INFO storage.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1458.0 B, free 2.6 GB)
16/09/19 11:28:26 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.100.4:38299 (size: 1458.0 B, free: 2.6 GB)
16/09/19 11:28:26 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:874
16/09/19 11:28:26 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (ShuffledRDD[4] at reduceByKey at JavaWordCount.java:61)
16/09/19 11:28:26 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
16/09/19 11:28:26 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, 192.168.100.15, PROCESS_LOCAL, 1243 bytes)
16/09/19 11:28:28 INFO storage.BlockManagerInfo: Removed broadcast_1_piece0 on 192.168.100.4:38299 in memory (size: 2.6 KB, free: 2.6 GB)
16/09/19 11:28:28 INFO storage.BlockManagerInfo: Removed broadcast_1_piece0 on 192.168.100.14:41531 in memory (size: 2.6 KB, free: 906.2 MB)
16/09/19 11:28:28 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 192.168.100.15:46824 (size: 1458.0 B, free: 906.2 MB)
16/09/19 11:28:28 INFO spark.MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 192.168.100.15:48188
16/09/19 11:28:28 INFO spark.MapOutputTrackerMaster: Size of output statuses for shuffle 0 is 144 bytes
16/09/19 11:28:28 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 2322 ms on 192.168.100.15 (1/1)
16/09/19 11:28:28 INFO scheduler.DAGScheduler: ResultStage 1 (collect at JavaWordCount.java:68) finished in 2.322 s
16/09/19 11:28:28 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool 
16/09/19 11:28:28 INFO scheduler.DAGScheduler: Job 0 finished: collect at JavaWordCount.java:68, took 6.953234 s
are: 1
back: 1
is: 3
ERROR: 1
a: 2
on: 1
content: 2
bad: 2
with: 1
some: 1
INFO: 4
to: 1
: 2
This: 3
more: 1
message: 1
More: 1
thing: 1
warning: 1
WARN: 2
normal: 1
Something: 1
happened: 1
other: 1
messages: 2
details: 1
the: 1
Here: 1
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/09/19 11:28:28 INFO handler.ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/09/19 11:28:28 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.100.4:7106
16/09/19 11:28:28 INFO scheduler.DAGScheduler: Stopping DAGScheduler
16/09/19 11:28:28 INFO cluster.SparkDeploySchedulerBackend: Shutting down all executors
16/09/19 11:28:28 INFO cluster.SparkDeploySchedulerBackend: Asking each executor to shut down
16/09/19 11:28:29 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/09/19 11:28:29 INFO util.Utils: path = /tmp/spark-e19f684c-686b-4841-a863-2e143face3c3/blockmgr-b6ea03ec-ee30-4133-992f-ccf27ef35c93, already present as root for deletion.
16/09/19 11:28:29 INFO storage.MemoryStore: MemoryStore cleared
16/09/19 11:28:29 INFO storage.BlockManager: BlockManager stopped
16/09/19 11:28:29 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
16/09/19 11:28:29 INFO spark.SparkContext: Successfully stopped SparkContext
16/09/19 11:28:29 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/09/19 11:28:29 INFO util.Utils: Shutdown hook called
16/09/19 11:28:29 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/09/19 11:28:29 INFO util.Utils: Deleting directory /tmp/spark-e19f684c-686b-4841-a863-2e143face3c3
16/09/19 11:28:29 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/09/19 11:28:29 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remoting shut down.

   So we can see the driver runs on the submitting host itself (the 'sparkDriver' service at 192.168.100.4:55817 above). That is the defining property of client deploy mode: the driver lives inside the spark-submit JVM on the local machine, while the executors are launched on the workers (192.168.100.6 through 192.168.100.15).
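
   A quick way to double-check this from inside the application is to read the driver endpoint back out of the SparkConf once the context is up. The sketch below is illustrative only (the class name WhereIsTheDriver is mine); spark.driver.host and spark.driver.port are the standard config keys SparkContext fills in at startup, and they are what the 'sparkDriver' log line above reports:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public final class WhereIsTheDriver {
  public static void main(String[] args) {
    JavaSparkContext ctx = new JavaSparkContext(new SparkConf().setAppName("WhereIsTheDriver"));
    // In client mode these resolve to the submitting host (192.168.100.4:55817 here);
    // in cluster mode they would resolve to whichever worker hosts the driver.
    System.out.println("driver = " + ctx.getConf().get("spark.driver.host", "?")
        + ":" + ctx.getConf().get("spark.driver.port", "?"));
    ctx.stop();
  }
}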

 
