$ docker run -it -p 8088:8088 -p 8042:8042 -h sandbox sequenceiq/spark:2.1.0 bash
/
Starting sshd: [ OK ]
Starting namenodes on [sandbox]
sandbox: starting namenode, logging to /usr/local/hadoop/logs/hadoop-root-namenode-sandbox.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-root-datanode-sandbox.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-root-secondarynamenode-sandbox.out
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn--resourcemanager-sandbox.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-root-nodemanager-sandbox.out
chown: missing operand after `/usr/local/hadoop/logs'
Try `chown --help' for more information.
starting historyserver, logging to /usr/local/hadoop/logs/mapred--historyserver-sandbox.out
bash-4.1#
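The `-p 8088:8088 -p 8042:8042` flags in the `docker run` above publish the YARN ResourceManager web UI (8088) and the NodeManager web UI (8042) to the host. A hypothetical sanity check from the host, assuming the daemons above have finished starting and the standard YARN REST endpoints are available:

```shell
# Query the ResourceManager REST API on the published port; returns cluster
# metadata as JSON once YARN is up. Requires the container started above.
curl -s http://localhost:8088/ws/v1/cluster/info
```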
bash-4.1# spark-submit \
> --class org.apache.spark.examples.SparkPi \
> --master yarn \
> --driver-memory 1g \
> --executor-memory 1g \
> --executor-cores 1 \
> $SPARK_HOME/examples/jars/spark-examples*.jar
17/12/05 16:35:21 INFO spark.SparkContext: Running Spark version 2.1.0
17/12/05 16:35:21 WARN spark.SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/12/05 16:35:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/12/05 16:35:22 INFO spark.SecurityManager: Changing view acls to: root
17/12/05 16:35:22 INFO spark.SecurityManager: Changing modify acls to: root
17/12/05 16:35:22 INFO spark.SecurityManager: Changing view acls groups to:
17/12/05 16:35:22 INFO spark.SecurityManager: Changing modify acls groups to:
17/12/05 16:35:22 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
17/12/05 16:35:22 INFO util.Utils: Successfully started service 'sparkDriver' on port 36813.
17/12/05 16:35:22 INFO spark.SparkEnv: Registering MapOutputTracker
17/12/05 16:35:23 INFO spark.SparkEnv: Registering BlockManagerMaster
17/12/05 16:35:23 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/12/05 16:35:23 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/12/05 16:35:23 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-91cf088d-df51-48ee-83da-e9b41c59016b
17/12/05 16:35:23 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
17/12/05 16:35:23 INFO spark.SparkEnv: Registering OutputCommitCoordinator
17/12/05 16:35:23 INFO util.log: Logging initialized @2218ms
17/12/05 16:35:23 INFO server.Server: jetty-9.2.z-SNAPSHOT
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@56ddcc4{/jobs,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6fb8caa4{/jobs/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4d000e49{/jobs/job,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3eaa021d{/jobs/job/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@b70de0f{/stages,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1f02b0a7{/stages/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@699bb3d8{/stages/stage,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6d3c6012{/stages/stage/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@16c775c5{/stages/pool,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@104e432{/stages/pool/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@68218f23{/storage,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@733c783d{/storage/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6fa27e6{/storage/rdd,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1151709e{/storage/rdd/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79b89df3{/environment,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4789faf3{/environment/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@33ba8c36{/executors,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c4b47c2{/executors/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@12542011{/executors/threadDump,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5105457d{/executors/threadDump/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@31153b19{/static,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@68daff7b{/,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb1a05{/api,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@22a93f26{/jobs/job/kill,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1570e991{/stages/stage/kill,null,AVAILABLE}
17/12/05 16:35:23 INFO server.ServerConnector: Started ServerConnector@1403c543{HTTP/1.1}{0.0.0.0:4040}
17/12/05 16:35:23 INFO server.Server: Started @2401ms
17/12/05 16:35:23 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
17/12/05 16:35:23 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://172.17.0.2:4040
17/12/05 16:35:23 INFO spark.SparkContext: Added JAR file:/usr/local/spark/examples/jars/spark-examples_2.11-2.1.0.jar at spark://172.17.0.2:36813/jars/spark-examples_2.11-2.1.0.jar with timestamp 1512509723534
17/12/05 16:35:24 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
17/12/05 16:35:24 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
17/12/05 16:35:24 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
17/12/05 16:35:24 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
17/12/05 16:35:24 INFO yarn.Client: Setting up container launch context for our AM
17/12/05 16:35:24 INFO yarn.Client: Setting up the launch environment for our AM container
17/12/05 16:35:24 INFO yarn.Client: Preparing resources for our AM container
17/12/05 16:35:26 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
17/12/05 16:35:29 INFO yarn.Client: Uploading resource file:/tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4/__spark_libs__4075756737931461340.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0001/__spark_libs__4075756737931461340.zip
17/12/05 16:35:33 INFO yarn.Client: Uploading resource file:/tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4/__spark_conf__1111616817187968052.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0001/__spark_conf__.zip
17/12/05 16:35:33 INFO spark.SecurityManager: Changing view acls to: root
17/12/05 16:35:33 INFO spark.SecurityManager: Changing modify acls to: root
17/12/05 16:35:33 INFO spark.SecurityManager: Changing view acls groups to:
17/12/05 16:35:33 INFO spark.SecurityManager: Changing modify acls groups to:
17/12/05 16:35:33 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
17/12/05 16:35:33 INFO yarn.Client: Submitting application application_1512509547564_0001 to ResourceManager
17/12/05 16:35:33 INFO impl.YarnClientImpl: Submitted application application_1512509547564_0001
17/12/05 16:35:33 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1512509547564_0001 and attemptId None
17/12/05 16:35:34 INFO yarn.Client: Application report for application_1512509547564_0001 (state: ACCEPTED)
17/12/05 16:35:34 INFO yarn.Client:
    client token: N/A
    diagnostics: N/A
    ApplicationMaster host: N/A
    ApplicationMaster RPC port: -1
    queue: default
    start time: 1512509733690
    final status: UNDEFINED
    tracking URL: http://sandbox:8088/proxy/application_1512509547564_0001/
    user: root
17/12/05 16:35:35 INFO yarn.Client: Application report for application_1512509547564_0001 (state: ACCEPTED)
17/12/05 16:35:36 INFO yarn.Client: Application report for application_1512509547564_0001 (state: ACCEPTED)
17/12/05 16:35:37 INFO yarn.Client: Application report for application_1512509547564_0001 (state: ACCEPTED)
17/12/05 16:35:38 INFO yarn.Client: Application report for application_1512509547564_0001 (state: ACCEPTED)
17/12/05 16:35:39 INFO yarn.Client: Application report for application_1512509547564_0001 (state: ACCEPTED)
17/12/05 16:35:40 INFO yarn.Client: Application report for application_1512509547564_0001 (state: ACCEPTED)
17/12/05 16:35:41 INFO yarn.Client: Application report for application_1512509547564_0001 (state: ACCEPTED)
17/12/05 16:35:42 INFO yarn.Client: Application report for application_1512509547564_0001 (state: ACCEPTED)
17/12/05 16:35:43 INFO yarn.Client: Application report for application_1512509547564_0001 (state: ACCEPTED)
17/12/05 16:35:44 INFO cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
17/12/05 16:35:44 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> sandbox, PROXY_URI_BASES -> http://sandbox:8088/proxy/application_1512509547564_0001), /proxy/application_1512509547564_0001
17/12/05 16:35:44 INFO ui.JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
17/12/05 16:35:44 INFO yarn.Client: Application report for application_1512509547564_0001 (state: RUNNING)
17/12/05 16:35:44 INFO yarn.Client:
    client token: N/A
    diagnostics: N/A
    ApplicationMaster host: 172.17.0.2
    ApplicationMaster RPC port: 0
    queue: default
    start time: 1512509733690
    final status: UNDEFINED
    tracking URL: http://sandbox:8088/proxy/application_1512509547564_0001/
    user: root
17/12/05 16:35:44 INFO cluster.YarnClientSchedulerBackend: Application application_1512509547564_0001 has started running.
17/12/05 16:35:44 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40773.
17/12/05 16:35:44 INFO netty.NettyBlockTransferService: Server created on 172.17.0.2:40773
17/12/05 16:35:44 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/12/05 16:35:44 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 172.17.0.2, 40773, None)
17/12/05 16:35:44 INFO storage.BlockManagerMasterEndpoint: Registering block manager 172.17.0.2:40773 with 366.3 MB RAM, BlockManagerId(driver, 172.17.0.2, 40773, None)
17/12/05 16:35:44 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 172.17.0.2, 40773, None)
17/12/05 16:35:44 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 172.17.0.2, 40773, None)
17/12/05 16:35:45 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@150b433a{/metrics/json,null,AVAILABLE}
17/12/05 16:35:50 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(null) (172.17.0.2:42952) with ID 1
17/12/05 16:35:51 INFO storage.BlockManagerMasterEndpoint: Registering block manager sandbox:46793 with 366.3 MB RAM, BlockManagerId(1, sandbox, 46793, None)
17/12/05 16:35:51 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(null) (172.17.0.2:42956) with ID 2
17/12/05 16:35:51 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
17/12/05 16:35:51 INFO storage.BlockManagerMasterEndpoint: Registering block manager sandbox:42597 with 366.3 MB RAM, BlockManagerId(2, sandbox, 42597, None)
17/12/05 16:35:51 INFO internal.SharedState: Warehouse path is 'file:/spark-warehouse'.
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6bdb8254{/SQL,null,AVAILABLE}
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@594ae07{/SQL/json,null,AVAILABLE}
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4c3787ec{/SQL/execution,null,AVAILABLE}
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3e60ee19{/SQL/execution/json,null,AVAILABLE}
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@254703a7{/static/sql,null,AVAILABLE}
17/12/05 16:35:52 INFO spark.SparkContext: Starting job: reduce at SparkPi.scala:38
17/12/05 16:35:52 INFO scheduler.DAGScheduler: Got job 0 (reduce at SparkPi.scala:38) with 2 output partitions
17/12/05 16:35:52 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (reduce at SparkPi.scala:38)
17/12/05 16:35:52 INFO scheduler.DAGScheduler: Parents of final stage: List()
17/12/05 16:35:52 INFO scheduler.DAGScheduler: Missing parents: List()
17/12/05 16:35:52 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34), which has no missing parents
17/12/05 16:35:52 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1832.0 B, free 366.3 MB)
17/12/05 16:35:52 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1172.0 B, free 366.3 MB)
17/12/05 16:35:52 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 172.17.0.2:40773 (size: 1172.0 B, free: 366.3 MB)
17/12/05 16:35:52 INFO spark.SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:996
17/12/05 16:35:52 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34)
17/12/05 16:35:52 INFO cluster.YarnScheduler: Adding task set 0.0 with 2 tasks
17/12/05 16:35:52 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, sandbox, executor 2, partition 0, PROCESS_LOCAL, 6033 bytes)
17/12/05 16:35:52 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, sandbox, executor 1, partition 1, PROCESS_LOCAL, 6033 bytes)
17/12/05 16:35:55 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on sandbox:42597 (size: 1172.0 B, free: 366.3 MB)
17/12/05 16:35:55 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on sandbox:46793 (size: 1172.0 B, free: 366.3 MB)
17/12/05 16:35:56 INFO scheduler.TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 3159 ms on sandbox (executor 1) (1/2)
17/12/05 16:35:56 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 3214 ms on sandbox (executor 2) (2/2)
17/12/05 16:35:56 INFO cluster.YarnScheduler: Removed TaskSet 0.0, whose tasks have all completed, from pool
17/12/05 16:35:56 INFO scheduler.DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:38) finished in 3.232 s
17/12/05 16:35:56 INFO scheduler.DAGScheduler: Job 0 finished: reduce at SparkPi.scala:38, took 3.632113 s
Pi is roughly 3.13959569797849
17/12/05 16:35:56 INFO server.ServerConnector: Stopped ServerConnector@1403c543{HTTP/1.1}{0.0.0.0:4040}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1570e991{/stages/stage/kill,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@22a93f26{/jobs/job/kill,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1bb1a05{/api,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@68daff7b{/,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@31153b19{/static,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5105457d{/executors/threadDump/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@12542011{/executors/threadDump,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1c4b47c2{/executors/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@33ba8c36{/executors,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4789faf3{/environment/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@79b89df3{/environment,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1151709e{/storage/rdd/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6fa27e6{/storage/rdd,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@733c783d{/storage/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@68218f23{/storage,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@104e432{/stages/pool/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@16c775c5{/stages/pool,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6d3c6012{/stages/stage/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@699bb3d8{/stages/stage,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1f02b0a7{/stages/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@b70de0f{/stages,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3eaa021d{/jobs/job/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4d000e49{/jobs/job,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6fb8caa4{/jobs/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@56ddcc4{/jobs,null,UNAVAILABLE}
17/12/05 16:35:56 INFO ui.SparkUI: Stopped Spark web UI at http://172.17.0.2:4040
17/12/05 16:35:56 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
17/12/05 16:35:56 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
17/12/05 16:35:56 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
17/12/05 16:35:56 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
services=List(),
started=false)
17/12/05 16:35:56 INFO cluster.YarnClientSchedulerBackend: Stopped
17/12/05 16:35:56 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/12/05 16:35:56 INFO memory.MemoryStore: MemoryStore cleared
17/12/05 16:35:56 INFO storage.BlockManager: BlockManager stopped
17/12/05 16:35:56 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
17/12/05 16:35:56 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/12/05 16:35:56 INFO spark.SparkContext: Successfully stopped SparkContext
17/12/05 16:35:56 INFO util.ShutdownHookManager: Shutdown hook called
17/12/05 16:35:56 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4

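The SparkPi job above estimates π by Monte Carlo sampling: it scatters random points in the unit square and multiplies the fraction landing inside the quarter circle by 4, which is why the printed value ("Pi is roughly 3.1395...") varies slightly between runs. A minimal local sketch of the same estimator, without Spark (plain awk; the sample count is arbitrary):

```shell
# Monte Carlo estimate of pi: fraction of uniform random points in the unit
# square that fall inside the quarter circle, times 4. No Spark involved.
awk 'BEGIN {
    srand(1); n = 100000; inside = 0
    for (i = 0; i < n; i++) {
        x = rand(); y = rand()
        if (x * x + y * y <= 1.0) inside++
    }
    printf "Pi is roughly %f\n", 4 * inside / n
}'
```

With 100,000 samples the estimate typically lands within about 0.01 of π, matching the precision seen in the run above.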
# spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --driver-memory 1g --executor-memory 1g --executor-cores 1 $SPARK_HOME/examples/jars/spark-examples*.jar
17/12/05 16:45:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/12/05 16:45:28 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
17/12/05 16:45:28 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
17/12/05 16:45:28 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
17/12/05 16:45:28 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
17/12/05 16:45:28 INFO yarn.Client: Setting up container launch context for our AM
17/12/05 16:45:28 INFO yarn.Client: Setting up the launch environment for our AM container
17/12/05 16:45:28 INFO yarn.Client: Preparing resources for our AM container
17/12/05 16:45:30 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
17/12/05 16:45:32 INFO yarn.Client: Uploading resource file:/tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3/__spark_libs__3702706301214674990.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/__spark_libs__3702706301214674990.zip
17/12/05 16:45:35 INFO yarn.Client: Uploading resource file:/usr/local/spark/examples/jars/spark-examples_2.11-2.1.0.jar -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/spark-examples_2.11-2.1.0.jar
17/12/05 16:45:36 INFO yarn.Client: Uploading resource file:/tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3/__spark_conf__3925502010861004620.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/__spark_conf__.zip
17/12/05 16:45:36 INFO spark.SecurityManager: Changing view acls to: root
17/12/05 16:45:36 INFO spark.SecurityManager: Changing modify acls to: root
17/12/05 16:45:36 INFO spark.SecurityManager: Changing view acls groups to:
17/12/05 16:45:36 INFO spark.SecurityManager: Changing modify acls groups to:
17/12/05 16:45:36 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
17/12/05 16:45:36 INFO yarn.Client: Submitting application application_1512509547564_0003 to ResourceManager
17/12/05 16:45:36 INFO impl.YarnClientImpl: Submitted application application_1512509547564_0003
17/12/05 16:45:37 INFO yarn.Client: Application report for application_1512509547564_0003 (state: ACCEPTED)
17/12/05 16:45:37 INFO yarn.Client:
    client token: N/A
    diagnostics: N/A
    ApplicationMaster host: N/A
    ApplicationMaster RPC port: -1
    queue: default
    start time: 1512510336231
    final status: UNDEFINED
    tracking URL: http://sandbox:8088/proxy/application_1512509547564_0003/
    user: root
17/12/05 16:45:38 INFO yarn.Client: Application report for application_1512509547564_0003 (state: ACCEPTED)
17/12/05 16:45:40 INFO yarn.Client: Application report for application_1512509547564_0003 (state: ACCEPTED)
17/12/05 16:45:41 INFO yarn.Client: Application report for application_1512509547564_0003 (state: ACCEPTED)
17/12/05 16:45:42 INFO yarn.Client: Application report for application_1512509547564_0003 (state: ACCEPTED)
17/12/05 16:45:43 INFO yarn.Client: Application report for application_1512509547564_0003 (state: ACCEPTED)
17/12/05 16:45:44 INFO yarn.Client: Application report for application_1512509547564_0003 (state: RUNNING)
17/12/05 16:45:44 INFO yarn.Client:
    client token: N/A
    diagnostics: N/A
    ApplicationMaster host: 172.17.0.2
    ApplicationMaster RPC port: 0
    queue: default
    start time: 1512510336231
    final status: UNDEFINED
    tracking URL: http://sandbox:8088/proxy/application_1512509547564_0003/
    user: root
17/12/05 16:45:45 INFO yarn.Client: Application report for application_1512509547564_0003 (state: RUNNING)
17/12/05 16:45:46 INFO yarn.Client: Application report for application_1512509547564_0003 (state: RUNNING)
17/12/05 16:45:47 INFO yarn.Client: Application report for application_1512509547564_0003 (state: RUNNING)
17/12/05 16:45:48 INFO yarn.Client: Application report for application_1512509547564_0003 (state: RUNNING)
17/12/05 16:45:49 INFO yarn.Client: Application report for application_1512509547564_0003 (state: RUNNING)
17/12/05 16:45:50 INFO yarn.Client: Application report for application_1512509547564_0003 (state: RUNNING)
17/12/05 16:45:51 INFO yarn.Client: Application report for application_1512509547564_0003 (state: FINISHED)
17/12/05 16:45:51 INFO yarn.Client:
    client token: N/A
    diagnostics: N/A
    ApplicationMaster host: 172.17.0.2
    ApplicationMaster RPC port: 0
    queue: default
    start time: 1512510336231
    final status: SUCCEEDED
    tracking URL: http://sandbox:8088/proxy/application_1512509547564_0003/
    user: root
17/12/05 16:45:51 INFO yarn.Client: Deleted staging directory hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003
17/12/05 16:45:51 INFO util.ShutdownHookManager: Shutdown hook called
17/12/05 16:45:51 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3
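Note the difference from the first run: with `--deploy-mode cluster` the driver runs inside the YARN ApplicationMaster container, so the application finishes with `final status: SUCCEEDED` but the "Pi is roughly ..." line goes to the container logs rather than to this shell. Assuming log aggregation is enabled on this cluster, the standard YARN CLI can retrieve it (application ID taken from the run above):

```shell
# Fetch the aggregated container logs for the cluster-mode run and pull out
# the driver's result line. Requires the YARN cluster from the session above.
yarn logs -applicationId application_1512509547564_0003 | grep "Pi is roughly"
```

If log aggregation is disabled, the same output can instead be found under the NodeManager's local container log directory, or via the tracking URL shown in the application report.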
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c4b47c2{/executors/json,null,AVAILABLE}
                                                                                     ^
Main.scala:63: error: Invalid literal number
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@31153b19{/static,null,AVAILABLE}
                                                                                     ^
Main.scala:64: error: Invalid literal number
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@68daff7b{/,null,AVAILABLE}
                                                                                     ^
Main.scala:65: error: Invalid literal number
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb1a05{/api,null,AVAILABLE}
                                                                                     ^
Main.scala:66: error: Invalid literal number
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@22a93f26{/jobs/job/kill,null,AVAILABLE}
                                                                                     ^
Main.scala:68: error: Invalid literal number
17/12/05 16:35:23 INFO server.ServerConnector: Started ServerConnector@1403c543{HTTP/1.1}{0.0.0.0:4040}
                                                                       ^
Main.scala:69: error: Invalid literal number
17/12/05 16:35:23 INFO server.Server: Started @2401ms
                                               ^
Main.scala:70: error: unclosed character literal
17/12/05 16:35:23 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
                                                                        ^
Main.scala:81: error: Invalid literal number
17/12/05 16:35:29 INFO yarn.Client: Uploading resource file:/tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4/__spark_libs__4075756737931461340.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0001/__spark_libs__4075756737931461340.zip
                                                                       ^
Main.scala:81: error: Invalid literal number
17/12/05 16:35:29 INFO yarn.Client: Uploading resource file:/tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4/__spark_libs__4075756737931461340.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0001/__spark_libs__4075756737931461340.zip
                                                                                     ^
Main.scala:82: error: Invalid literal number
17/12/05 16:35:33 INFO yarn.Client: Uploading resource file:/tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4/__spark_conf__1111616817187968052.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0001/__spark_conf__.zip
                                                                       ^
Main.scala:82: error: Invalid literal number
17/12/05 16:35:33 INFO yarn.Client: Uploading resource file:/tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4/__spark_conf__1111616817187968052.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0001/__spark_conf__.zip
                                                                                     ^
Main.scala:126: error: unclosed character literal
17/12/05 16:35:44 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40773.
                                                                                                                         ^
Main.scala:133: error: Invalid literal number
17/12/05 16:35:45 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@150b433a{/metrics/json,null,AVAILABLE}
                                                                                     ^
Main.scala:139: error: unclosed character literal
17/12/05 16:35:51 INFO internal.SharedState: Warehouse path is 'file:/spark-warehouse'.
                                                                                     ^
Main.scala:140: error: Invalid literal number
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6bdb8254{/SQL,null,AVAILABLE}
                                                                                     ^
Main.scala:141: error: Invalid literal number
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@594ae07{/SQL/json,null,AVAILABLE}
                                                                                     ^
Main.scala:142: error: Invalid literal number
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4c3787ec{/SQL/execution,null,AVAILABLE}
                                                                                     ^
Main.scala:143: error: Invalid literal number
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3e60ee19{/SQL/execution/json,null,AVAILABLE}
                                                                                     ^
Main.scala:144: error: Invalid literal number
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@254703a7{/static/sql,null,AVAILABLE}
                                                                                     ^
Main.scala:167: error: Invalid literal number
17/12/05 16:35:56 INFO server.ServerConnector: Stopped ServerConnector@1403c543{HTTP/1.1}{0.0.0.0:4040}
                                                                       ^
Main.scala:169: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@22a93f26{/jobs/job/kill,null,UNAVAILABLE}
                                                                                     ^
Main.scala:170: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1bb1a05{/api,null,UNAVAILABLE}
                                                                                     ^
Main.scala:171: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@68daff7b{/,null,UNAVAILABLE}
                                                                                     ^
Main.scala:172: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@31153b19{/static,null,UNAVAILABLE}
                                                                                     ^
Main.scala:175: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1c4b47c2{/executors/json,null,UNAVAILABLE}
                                                                                     ^
Main.scala:176: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@33ba8c36{/executors,null,UNAVAILABLE}
                                                                                     ^
Main.scala:177: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4789faf3{/environment/json,null,UNAVAILABLE}
                                                                                     ^
Main.scala:178: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@79b89df3{/environment,null,UNAVAILABLE}
                                                                                     ^
Main.scala:179: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1151709e{/storage/rdd/json,null,UNAVAILABLE}
                                                                                     ^
Main.scala:180: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6fa27e6{/storage/rdd,null,UNAVAILABLE}
                                                                                     ^
Main.scala:181: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@733c783d{/storage/json,null,UNAVAILABLE}
                                                                                     ^
Main.scala:182: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@68218f23{/storage,null,UNAVAILABLE}
                                                                                     ^
Main.scala:184: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@16c775c5{/stages/pool,null,UNAVAILABLE}
                                                                                     ^
Main.scala:185: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6d3c6012{/stages/stage/json,null,UNAVAILABLE}
                                                                                     ^
Main.scala:185: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6d3c6012{/stages/stage/json,null,UNAVAILABLE}
                                                                                       ^
Main.scala:186: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@699bb3d8{/stages/stage,null,UNAVAILABLE}
                                                                                     ^
Main.scala:187: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1f02b0a7{/stages/json,null,UNAVAILABLE}
                                                                                     ^
Main.scala:187: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1f02b0a7{/stages/json,null,UNAVAILABLE}
                                                                                       ^
Main.scala:189: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3eaa021d{/jobs/job/json,null,UNAVAILABLE}
                                                                                     ^
Main.scala:190: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4d000e49{/jobs/job,null,UNAVAILABLE}
                                                                                     ^
Main.scala:191: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6fb8caa4{/jobs/json,null,UNAVAILABLE}
                                                                                     ^
Main.scala:192: error: Invalid literal number
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@56ddcc4{/jobs,null,UNAVAILABLE}
                                                                                     ^
Main.scala:209: error: Invalid literal number
17/12/05 16:35:56 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4
                                                                               ^
Main.scala:209: error: Invalid literal number
17/12/05 16:35:56 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4
                                                                                             ^
Main.scala:211: error: Invalid literal number
# spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --driver-memory 1g --executor-memory 1g --executor-cores 1 $SPARK_HOME/examples/jars/spark-examples*.jar
                                                                                                             ^
Main.scala:211: error: Invalid literal number
# spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --driver-memory 1g --executor-memory 1g --executor-cores 1 $SPARK_HOME/examples/jars/spark-examples*.jar
                                                                                                                                  ^
Main.scala:221: error: Invalid literal number
17/12/05 16:45:32 INFO yarn.Client: Uploading resource file:/tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3/__spark_libs__3702706301214674990.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/__spark_libs__3702706301214674990.zip
                                                                       ^
Main.scala:221: error: Invalid literal number
17/12/05 16:45:32 INFO yarn.Client: Uploading resource file:/tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3/__spark_libs__3702706301214674990.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/__spark_libs__3702706301214674990.zip
                                                                                ^
Main.scala:221: error: Invalid literal number
17/12/05 16:45:32 INFO yarn.Client: Uploading resource file:/tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3/__spark_libs__3702706301214674990.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/__spark_libs__3702706301214674990.zip
                                                                                               ^
Main.scala:223: error: Invalid literal number
17/12/05 16:45:36 INFO yarn.Client: Uploading resource file:/tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3/__spark_conf__3925502010861004620.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/__spark_conf__.zip
                                                                       ^
Main.scala:223: error: Invalid literal number
17/12/05 16:45:36 INFO yarn.Client: Uploading resource file:/tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3/__spark_conf__3925502010861004620.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/__spark_conf__.zip
                                                                                ^
Main.scala:223: error: Invalid literal number
17/12/05 16:45:36 INFO yarn.Client: Uploading resource file:/tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3/__spark_conf__3925502010861004620.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/__spark_conf__.zip
                                                                                               ^
Main.scala:277: error: Invalid literal number
17/12/05 16:45:51 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3
                                                                               ^
Main.scala:277: error: Invalid literal number
17/12/05 16:45:51 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3
                                                                                        ^
Main.scala:277: error: Invalid literal number
17/12/05 16:45:51 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3
                                                                                                       ^
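For reference, the SparkPi job submitted above estimates π by Monte Carlo sampling: it draws random points in the square [-1, 1]² and multiplies the fraction that land inside the unit circle by 4. A minimal standalone sketch of the same estimator (plain Python for illustration; the actual example is Scala code distributed by Spark, and `estimate_pi` is a name chosen here, not part of Spark):

```python
import random

def estimate_pi(samples: int, seed: int = 42) -> float:
    """Monte Carlo estimate of pi: 4 * (points inside unit circle / total points)."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x = rng.uniform(-1.0, 1.0)
        y = rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples

if __name__ == "__main__":
    print("Pi is roughly", estimate_pi(1_000_000))
```

In client mode (`--deploy-mode` unset, as in the first run) the "Pi is roughly ..." line is printed by the driver to the console; in cluster mode (the second run) the driver runs inside YARN, so the result ends up in the application's container logs instead.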