17/12/05 16:35:21 INFO spark.SparkContext: Running Spark version 2.1.0
17/12/05 16:35:21 WARN spark.SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/12/05 16:35:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/12/05 16:35:22 INFO spark.SecurityManager: Changing view acls to: root
17/12/05 16:35:22 INFO spark.SecurityManager: Changing modify acls to: root
17/12/05 16:35:22 INFO spark.SecurityManager: Changing view acls groups to:
17/12/05 16:35:22 INFO spark.SecurityManager: Changing modify acls groups to:
17/12/05 16:35:22 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
17/12/05 16:35:22 INFO util.Utils: Successfully started service 'sparkDriver' on port 36813.
17/12/05 16:35:22 INFO spark.SparkEnv: Registering MapOutputTracker
17/12/05 16:35:23 INFO spark.SparkEnv: Registering BlockManagerMaster
17/12/05 16:35:23 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/12/05 16:35:23 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/12/05 16:35:23 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-91cf088d-df51-48ee-83da-e9b41c59016b
17/12/05 16:35:23 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
17/12/05 16:35:23 INFO spark.SparkEnv: Registering OutputCommitCoordinator
17/12/05 16:35:23 INFO util.log: Logging initialized @2218ms
17/12/05 16:35:23 INFO server.Server: jetty-9.2.z-SNAPSHOT
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@56ddcc4{/jobs,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6fb8caa4{/jobs/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4d000e49{/jobs/job,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3eaa021d{/jobs/job/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@b70de0f{/stages,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1f02b0a7{/stages/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@699bb3d8{/stages/stage,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6d3c6012{/stages/stage/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@16c775c5{/stages/pool,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@104e432{/stages/pool/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@68218f23{/storage,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@733c783d{/storage/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6fa27e6{/storage/rdd,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1151709e{/storage/rdd/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@79b89df3{/environment,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4789faf3{/environment/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@33ba8c36{/executors,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c4b47c2{/executors/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@12542011{/executors/threadDump,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5105457d{/executors/threadDump/json,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@31153b19{/static,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@68daff7b{/,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb1a05{/api,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@22a93f26{/jobs/job/kill,null,AVAILABLE}
17/12/05 16:35:23 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1570e991{/stages/stage/kill,null,AVAILABLE}
17/12/05 16:35:23 INFO server.ServerConnector: Started ServerConnector@1403c543{HTTP/1.1}{0.0.0.0:4040}
17/12/05 16:35:23 INFO server.Server: Started @2401ms
17/12/05 16:35:23 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
17/12/05 16:35:23 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://172.17.0.2:4040
17/12/05 16:35:23 INFO spark.SparkContext: Added JAR file:/usr/local/spark/examples/jars/spark-examples_2.11-2.1.0.jar at spark://172.17.0.2:36813/jars/spark-examples_2.11-2.1.0.jar with timestamp 1512509723534
17/12/05 16:35:24 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
17/12/05 16:35:24 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
17/12/05 16:35:24 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
17/12/05 16:35:24 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
17/12/05 16:35:24 INFO yarn.Client: Setting up container launch context for our AM
17/12/05 16:35:24 INFO yarn.Client: Setting up the launch environment for our AM container
17/12/05 16:35:24 INFO yarn.Client: Preparing resources for our AM container
17/12/05 16:35:26 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
17/12/05 16:35:29 INFO yarn.Client: Uploading resource file:/tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4/__spark_libs__4075756737931461340.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0001/__spark_libs__4075756737931461340.zip
17/12/05 16:35:33 INFO yarn.Client: Uploading resource file:/tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4/__spark_conf__1111616817187968052.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0001/__spark_conf__.zip
17/12/05 16:35:33 INFO spark.SecurityManager: Changing view acls to: root
17/12/05 16:35:33 INFO spark.SecurityManager: Changing modify acls to: root
17/12/05 16:35:33 INFO spark.SecurityManager: Changing view acls groups to:
17/12/05 16:35:33 INFO spark.SecurityManager: Changing modify acls groups to:
17/12/05 16:35:33 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
17/12/05 16:35:33 INFO yarn.Client: Submitting application application_1512509547564_0001 to ResourceManager
17/12/05 16:35:33 INFO impl.YarnClientImpl: Submitted application application_1512509547564_0001
17/12/05 16:35:33 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1512509547564_0001 and attemptId None
17/12/05 16:35:34 INFO yarn.Client: Application report for application_1512509547564_0001 (state: ACCEPTED)
17/12/05 16:35:51 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(null) (172.17.0.2:42956) with ID 2
17/12/05 16:35:51 INFO cluster.YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
17/12/05 16:45:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/12/05 16:45:28 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
17/12/05 16:45:28 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers
17/12/05 16:45:28 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
17/12/05 16:45:28 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead
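The two AM container sizes reported above (896 MB for the first, client-mode submission and 1408 MB for the second, with a 1 g driver) are consistent with how Spark 2.x's YARN client appears to size the ApplicationManager: the requested memory plus an overhead of max(384 MB, 10% of the request). A minimal sketch of that arithmetic; the constants and the 512 MB client-mode AM default are assumptions from Spark's documented defaults, not values printed in this log:

```python
# Assumed Spark 2.x YARN defaults:
# overhead = max(384 MB, 10% of the AM/driver memory request).
MEMORY_OVERHEAD_MIN_MB = 384
MEMORY_OVERHEAD_FACTOR = 0.10

def am_container_mb(am_memory_mb: int) -> int:
    """Total AM container size in MB: requested memory plus YARN overhead."""
    overhead = max(MEMORY_OVERHEAD_MIN_MB, int(am_memory_mb * MEMORY_OVERHEAD_FACTOR))
    return am_memory_mb + overhead

# Client mode with the assumed 512 MB AM default matches the 896 MB line.
print(am_container_mb(512))
# Cluster mode with --driver-memory 1g (1024 MB) matches the 1408 MB line.
print(am_container_mb(1024))
```

With requests this small the 384 MB floor dominates, which is why both runs show exactly "384 MB overhead".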
17/12/05 16:45:28 INFO yarn.Client: Setting up container launch context for our AM
17/12/05 16:45:28 INFO yarn.Client: Setting up the launch environment for our AM container
17/12/05 16:45:28 INFO yarn.Client: Preparing resources for our AM container
17/12/05 16:45:30 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
17/12/05 16:45:32 INFO yarn.Client: Uploading resource file:/tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3/__spark_libs__3702706301214674990.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/__spark_libs__3702706301214674990.zip
17/12/05 16:45:35 INFO yarn.Client: Uploading resource file:/usr/local/spark/examples/jars/spark-examples_2.11-2.1.0.jar -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/spark-examples_2.11-2.1.0.jar
17/12/05 16:45:36 INFO yarn.Client: Uploading resource file:/tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3/__spark_conf__3925502010861004620.zip -> hdfs://sandbox:9000/user/root/.sparkStaging/application_1512509547564_0003/__spark_conf__.zip
17/12/05 16:45:36 INFO spark.SecurityManager: Changing view acls to: root
17/12/05 16:45:36 INFO spark.SecurityManager: Changing modify acls to: root
17/12/05 16:45:36 INFO spark.SecurityManager: Changing view acls groups to:
17/12/05 16:45:36 INFO spark.SecurityManager: Changing modify acls groups to:
17/12/05 16:45:36 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
17/12/05 16:45:36 INFO yarn.Client: Submitting application application_1512509547564_0003 to ResourceManager
17/12/05 16:45:36 INFO impl.YarnClientImpl: Submitted application application_1512509547564_0003
17/12/05 16:45:37 INFO yarn.Client: Application report for application_1512509547564_0003 (state: ACCEPTED)
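Both submissions run org.apache.spark.examples.SparkPi from the examples jar. Its estimate is a plain Monte Carlo calculation: sample points in the square [-1, 1] x [-1, 1], count those inside the unit circle, and scale by 4. A rough, non-Spark sketch of the same idea; the sample count and seed are arbitrary, and this is not the example's actual code:

```python
import random

def estimate_pi(samples: int, seed: int = 42) -> float:
    """Monte Carlo pi estimate: 4 * (points inside the unit circle / samples)."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x, y = rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples

print(estimate_pi(100_000))
```

The Spark version distributes the sampling loop across executors and sums the counts with a reduce; the arithmetic per sample is the same.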
$ docker run -it -p 8088:8088 -p 8042:8042 -h sandbox sequenceiq/spark:2.1.0 bash
/
Starting sshd: [ OK ]
Starting namenodes on [sandbox]
sandbox: starting namenode, logging to /usr/local/hadoop/logs/hadoop-root-namenode-sandbox.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-root-datanode-sandbox.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-root-secondarynamenode-sandbox.out
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn--resourcemanager-sandbox.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-root-nodemanager-sandbox.out
chown: missing operand after `/usr/local/hadoop/logs'
Try `chown --help' for more information.
bash-4.1#
> --class org.apache.spark.examples.SparkPi \
> --master yarn \
> --driver-memory 1g \
> --executor-memory 1g \
> --executor-cores 1 \
17/12/05 16:35:44 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40773.
17/12/05 16:35:45 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@150b433a{/metrics/json,null,AVAILABLE}
17/12/05 16:35:51 INFO internal.SharedState: Warehouse path is 'file:/spark-warehouse'.
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6bdb8254{/SQL,null,AVAILABLE}
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@594ae07{/SQL/json,null,AVAILABLE}
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4c3787ec{/SQL/execution,null,AVAILABLE}
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3e60ee19{/SQL/execution/json,null,AVAILABLE}
17/12/05 16:35:51 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@254703a7{/static/sql,null,AVAILABLE}
17/12/05 16:35:56 INFO server.ServerConnector: Stopped ServerConnector@1403c543{HTTP/1.1}{0.0.0.0:4040}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@22a93f26{/jobs/job/kill,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1bb1a05{/api,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@68daff7b{/,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@31153b19{/static,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1c4b47c2{/executors/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@33ba8c36{/executors,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4789faf3{/environment/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@79b89df3{/environment,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1151709e{/storage/rdd/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6fa27e6{/storage/rdd,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@733c783d{/storage/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@68218f23{/storage,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@16c775c5{/stages/pool,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6d3c6012{/stages/stage/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@699bb3d8{/stages/stage,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1f02b0a7{/stages/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3eaa021d{/jobs/job/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4d000e49{/jobs/job,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6fb8caa4{/jobs/json,null,UNAVAILABLE}
17/12/05 16:35:56 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@56ddcc4{/jobs,null,UNAVAILABLE}
17/12/05 16:35:56 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-0de513c5-a853-46bd-b910-e5fdc52808f4
# spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --driver-memory 1g --executor-memory 1g --executor-cores 1 $SPARK_HOME/examples/jars/spark-examples*.jar
17/12/05 16:45:51 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-97a30f4d-103b-4556-b55d-8cb24d806ce3