17/02/28 10:47:06 INFO StateStore$: Using BlackholeStateStore for recovery.
17/02/28 10:47:06 INFO BatchSessionManager: Recovered 0 batch sessions. Next session id: 0
17/02/28 10:47:06 INFO InteractiveSessionManager: Recovered 0 interactive sessions. Next session id: 0
17/02/28 10:47:06 INFO InteractiveSessionManager: Heartbeat watchdog thread started.
17/02/28 10:47:06 INFO RMProxy: Connecting to ResourceManager at ip-10-200-139-129.ec2.internal/10.200.139.129:8032
17/02/28 10:47:06 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/28 10:47:06 INFO WebServer: Starting server on http://ip-10-200-139-129.ec2.internal:8998
17/02/28 10:47:42 INFO InteractiveSession$: Creating LivyClient for sessionId: 0
17/02/28 10:47:42 WARN RSCConf: Your hostname, ip-10-200-139-129.ec2.internal, resolves to a loopback address, but we couldn't find any external IP address!
17/02/28 10:47:42 WARN RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
17/02/28 10:47:42 INFO InteractiveSessionManager: Registering new session 0
17/02/28 10:47:44 INFO ContextLauncher: Ivy Default Cache set to: /home/hadoop/.ivy2/cache
17/02/28 10:47:44 INFO ContextLauncher: The jars for the packages stored in: /home/hadoop/.ivy2/jars
17/02/28 10:47:44 INFO ContextLauncher: :: loading settings :: url = jar:file:/usr/lib/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
17/02/28 10:47:44 INFO ContextLauncher: com.datastax.spark#spark-cassandra-connector_2.11 added as a dependency
17/02/28 10:47:44 INFO ContextLauncher: org.mongodb.spark#mongo-spark-connector_2.10 added as a dependency
17/02/28 10:47:44 INFO ContextLauncher: :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
17/02/28 10:47:44 INFO ContextLauncher: confs: [default]
17/02/28 10:47:44 INFO ContextLauncher: found com.datastax.spark#spark-cassandra-connector_2.11;2.0.0-M3 in central
17/02/28 10:47:44 INFO ContextLauncher: found io.netty#netty-all;4.0.33.Final in central
17/02/28 10:47:44 INFO ContextLauncher: found commons-beanutils#commons-beanutils;1.8.0 in central
17/02/28 10:47:44 INFO ContextLauncher: found joda-time#joda-time;2.3 in central
17/02/28 10:47:44 INFO ContextLauncher: found com.twitter#jsr166e;1.1.0 in central
17/02/28 10:47:44 INFO ContextLauncher: found org.joda#joda-convert;1.2 in central
17/02/28 10:47:44 INFO ContextLauncher: found org.scala-lang#scala-reflect;2.11.8 in central
17/02/28 10:47:44 INFO ContextLauncher: found org.mongodb.spark#mongo-spark-connector_2.10;2.0.0 in central
17/02/28 10:47:44 INFO ContextLauncher: found org.mongodb#mongo-java-driver;3.2.2 in central
17/02/28 10:47:44 INFO ContextLauncher: :: resolution report :: resolve 612ms :: artifacts dl 25ms
17/02/28 10:47:44 INFO ContextLauncher: :: modules in use:
17/02/28 10:47:44 INFO ContextLauncher: com.datastax.spark#spark-cassandra-connector_2.11;2.0.0-M3 from central in [default]
17/02/28 10:47:44 INFO ContextLauncher: com.twitter#jsr166e;1.1.0 from central in [default]
17/02/28 10:47:44 INFO ContextLauncher: commons-beanutils#commons-beanutils;1.8.0 from central in [default]
17/02/28 10:47:44 INFO ContextLauncher: io.netty#netty-all;4.0.33.Final from central in [default]
17/02/28 10:47:44 INFO ContextLauncher: joda-time#joda-time;2.3 from central in [default]
17/02/28 10:47:44 INFO ContextLauncher: org.joda#joda-convert;1.2 from central in [default]
17/02/28 10:47:44 INFO ContextLauncher: org.mongodb#mongo-java-driver;3.2.2 from central in [default]
17/02/28 10:47:44 INFO ContextLauncher: org.mongodb.spark#mongo-spark-connector_2.10;2.0.0 from central in [default]
17/02/28 10:47:44 INFO ContextLauncher: org.scala-lang#scala-reflect;2.11.8 from central in [default]
17/02/28 10:47:44 INFO ContextLauncher: ---------------------------------------------------------------------
17/02/28 10:47:44 INFO ContextLauncher: |                  |            modules            ||   artifacts   |
17/02/28 10:47:44 INFO ContextLauncher: |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
17/02/28 10:47:44 INFO ContextLauncher: ---------------------------------------------------------------------
17/02/28 10:47:44 INFO ContextLauncher: |      default     |   9   |   0   |   0   |   0   ||   9   |   0   |
17/02/28 10:47:44 INFO ContextLauncher: ---------------------------------------------------------------------
17/02/28 10:47:44 INFO ContextLauncher: :: retrieving :: org.apache.spark#spark-submit-parent
17/02/28 10:47:44 INFO ContextLauncher: confs: [default]
17/02/28 10:47:44 INFO ContextLauncher: 0 artifacts copied, 9 already retrieved (0kB/16ms)
17/02/28 10:47:45 INFO ContextLauncher: 17/02/28 10:47:45 INFO RSCDriver: Connecting to: ip-10-200-139-129.ec2.internal:39248
17/02/28 10:47:45 INFO ContextLauncher: 17/02/28 10:47:45 INFO RSCDriver: Starting RPC server...
17/02/28 10:47:45 INFO ContextLauncher: 17/02/28 10:47:45 WARN RSCConf: Your hostname, ip-10-200-139-129.ec2.internal, resolves to a loopback address, but we couldn't find any external IP address!
17/02/28 10:47:45 INFO ContextLauncher: 17/02/28 10:47:45 WARN RSCConf: Set livy.rsc.rpc.server.address if you need to bind to another address.
17/02/28 10:47:45 INFO ContextLauncher: 17/02/28 10:47:45 INFO RSCDriver: Received job request af9b1cc5-7f51-49dd-9d29-9a5a7ed611ad
17/02/28 10:47:45 INFO ContextLauncher: 17/02/28 10:47:45 INFO RSCDriver: SparkContext not yet up, queueing job request.
17/02/28 10:47:50 INFO ContextLauncher: 17/02/28 10:47:50 INFO SparkContext: Running Spark version 2.0.2
17/02/28 10:47:50 INFO ContextLauncher: 17/02/28 10:47:50 INFO SecurityManager: Changing view acls to: hadoop
17/02/28 10:47:50 INFO ContextLauncher: 17/02/28 10:47:50 INFO SecurityManager: Changing modify acls to: hadoop
17/02/28 10:47:50 INFO ContextLauncher: 17/02/28 10:47:50 INFO SecurityManager: Changing view acls groups to:
17/02/28 10:47:50 INFO ContextLauncher: 17/02/28 10:47:50 INFO SecurityManager: Changing modify acls groups to:
17/02/28 10:47:50 INFO ContextLauncher: 17/02/28 10:47:50 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
17/02/28 10:47:50 INFO ContextLauncher: 17/02/28 10:47:50 INFO Utils: Successfully started service 'sparkDriver' on port 41666.
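[Editor's note] The resolution report above mixes Scala binary versions: spark-cassandra-connector_2.11 sits next to mongo-spark-connector_2.10, on a Spark 2.0.2 build that is Scala 2.11 (note scala-reflect;2.11.8 in the same list). Loading a _2.10 artifact into a 2.11 REPL classpath is a classic trigger for the scala.reflect FatalError that appears further down. A minimal fix sketch, assuming the packages were supplied through spark.jars.packages (the log does not show how the session was actually created):

    # spark.jars.packages with Scala-2.11 builds of both connectors;
    # org.mongodb.spark publishes a _2.11 build of 2.0.0 as well
    spark.jars.packages  com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M3,org.mongodb.spark:mongo-spark-connector_2.11:2.0.0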
17/02/28 10:47:50 INFO ContextLauncher: 17/02/28 10:47:50 INFO SparkEnv: Registering MapOutputTracker
17/02/28 10:47:51 INFO ContextLauncher: 17/02/28 10:47:51 INFO SparkEnv: Registering BlockManagerMaster
17/02/28 10:47:51 INFO ContextLauncher: 17/02/28 10:47:51 INFO DiskBlockManager: Created local directory at /mnt/tmp/blockmgr-04a5f128-fb19-4ab5-adf1-ab7d2b61b6da
17/02/28 10:47:51 INFO ContextLauncher: 17/02/28 10:47:51 INFO MemoryStore: MemoryStore started with capacity 414.4 MB
17/02/28 10:47:51 INFO ContextLauncher: 17/02/28 10:47:51 INFO SparkEnv: Registering OutputCommitCoordinator
17/02/28 10:47:51 INFO ContextLauncher: 17/02/28 10:47:51 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/02/28 10:47:51 INFO ContextLauncher: 17/02/28 10:47:51 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.200.139.129:4040
17/02/28 10:47:51 INFO ContextLauncher: 17/02/28 10:47:51 INFO Utils: Using initial executors = 1, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
17/02/28 10:47:52 INFO ContextLauncher: 17/02/28 10:47:52 INFO RMProxy: Connecting to ResourceManager at ip-10-200-139-129.ec2.internal/10.200.139.129:8032
17/02/28 10:47:52 INFO ContextLauncher: 17/02/28 10:47:52 INFO Client: Requesting a new application from cluster with 2 NodeManagers
17/02/28 10:47:52 INFO ContextLauncher: 17/02/28 10:47:52 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (11520 MB per container)
17/02/28 10:47:52 INFO ContextLauncher: 17/02/28 10:47:52 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
17/02/28 10:47:52 INFO ContextLauncher: 17/02/28 10:47:52 INFO Client: Setting up container launch context for our AM
17/02/28 10:47:52 INFO ContextLauncher: 17/02/28 10:47:52 INFO Client: Setting up the launch environment for our AM container
17/02/28 10:47:52 INFO ContextLauncher: 17/02/28 10:47:52 INFO Client: Preparing resources for our AM container
17/02/28 10:47:52 INFO ContextLauncher: 17/02/28 10:47:52 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
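[Editor's note] The warning just above means every Spark jar under SPARK_HOME is re-uploaded to HDFS on each submission (visible as the __spark_libs__*.zip upload below). It is unrelated to the crash, but the usual remedy is a one-time upload plus spark.yarn.archive. A sketch, with a hypothetical HDFS path:

    # one-time: package the Spark jars and push them to HDFS
    zip -r -j spark-libs.zip /usr/lib/spark/jars/*
    hdfs dfs -mkdir -p /user/spark/share
    hdfs dfs -put spark-libs.zip /user/spark/share/

    # spark-defaults.conf
    spark.yarn.archive  hdfs:///user/spark/share/spark-libs.zip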
17/02/28 10:47:55 INFO ContextLauncher: 17/02/28 10:47:55 INFO Client: Uploading resource file:/mnt/tmp/spark-80f8ede5-e1cc-46a2-bc2a-38be26a95933/__spark_libs__3027229192936044972.zip -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/__spark_libs__3027229192936044972.zip
17/02/28 10:47:56 INFO ContextLauncher: 17/02/28 10:47:56 INFO Client: Uploading resource file:/home/hadoop/livy_max/build/livy/rsc/target/jars/netty-all-4.0.29.Final.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/netty-all-4.0.29.Final.jar
17/02/28 10:47:56 INFO ContextLauncher: 17/02/28 10:47:56 INFO Client: Uploading resource file:/home/hadoop/livy_max/build/livy/rsc/target/jars/livy-rsc-0.3.1-SNAPSHOT.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/livy-rsc-0.3.1-SNAPSHOT.jar
17/02/28 10:47:56 INFO ContextLauncher: 17/02/28 10:47:56 INFO Client: Uploading resource file:/home/hadoop/livy_max/build/livy/rsc/target/jars/livy-api-0.3.1-SNAPSHOT.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/livy-api-0.3.1-SNAPSHOT.jar
17/02/28 10:47:56 INFO ContextLauncher: 17/02/28 10:47:56 INFO Client: Uploading resource file:/home/hadoop/livy_max/build/livy/repl/scala-2.11/target/jars/commons-codec-1.9.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/commons-codec-1.9.jar
17/02/28 10:47:56 INFO ContextLauncher: 17/02/28 10:47:56 INFO Client: Uploading resource file:/home/hadoop/livy_max/build/livy/repl/scala-2.11/target/jars/livy-repl_2.11-0.3.1-SNAPSHOT.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/livy-repl_2.11-0.3.1-SNAPSHOT.jar
17/02/28 10:47:56 INFO ContextLauncher: 17/02/28 10:47:56 INFO Client: Uploading resource file:/home/hadoop/livy_max/build/livy/repl/scala-2.11/target/jars/livy-core_2.11-0.3.1-SNAPSHOT.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/livy-core_2.11-0.3.1-SNAPSHOT.jar
17/02/28 10:47:56 INFO ContextLauncher: 17/02/28 10:47:56 INFO Client: Uploading resource file:/home/hadoop/.ivy2/jars/com.datastax.spark_spark-cassandra-connector_2.11-2.0.0-M3.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/com.datastax.spark_spark-cassandra-connector_2.11-2.0.0-M3.jar
17/02/28 10:47:56 INFO ContextLauncher: 17/02/28 10:47:56 INFO Client: Uploading resource file:/home/hadoop/.ivy2/jars/org.mongodb.spark_mongo-spark-connector_2.10-2.0.0.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/org.mongodb.spark_mongo-spark-connector_2.10-2.0.0.jar
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO Client: Uploading resource file:/home/hadoop/.ivy2/jars/io.netty_netty-all-4.0.33.Final.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/io.netty_netty-all-4.0.33.Final.jar
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO Client: Uploading resource file:/home/hadoop/.ivy2/jars/commons-beanutils_commons-beanutils-1.8.0.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/commons-beanutils_commons-beanutils-1.8.0.jar
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO Client: Uploading resource file:/home/hadoop/.ivy2/jars/joda-time_joda-time-2.3.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/joda-time_joda-time-2.3.jar
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO Client: Uploading resource file:/home/hadoop/.ivy2/jars/com.twitter_jsr166e-1.1.0.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/com.twitter_jsr166e-1.1.0.jar
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO Client: Uploading resource file:/home/hadoop/.ivy2/jars/org.joda_joda-convert-1.2.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/org.joda_joda-convert-1.2.jar
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO Client: Uploading resource file:/home/hadoop/.ivy2/jars/org.scala-lang_scala-reflect-2.11.8.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/org.scala-lang_scala-reflect-2.11.8.jar
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO Client: Uploading resource file:/home/hadoop/.ivy2/jars/org.mongodb_mongo-java-driver-3.2.2.jar -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/org.mongodb_mongo-java-driver-3.2.2.jar
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO Client: Uploading resource file:/mnt/tmp/spark-80f8ede5-e1cc-46a2-bc2a-38be26a95933/__spark_conf__7559040700142515392.zip -> hdfs://ip-10-200-139-129.ec2.internal:8020/user/hadoop/.sparkStaging/application_1482343367445_0211/__spark_conf__.zip
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO SecurityManager: Changing view acls to: hadoop
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO SecurityManager: Changing modify acls to: hadoop
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO SecurityManager: Changing view acls groups to:
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO SecurityManager: Changing modify acls groups to:
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO Client: Submitting application application_1482343367445_0211 to ResourceManager
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO YarnClientImpl: Submitted application application_1482343367445_0211
17/02/28 10:47:57 INFO ContextLauncher: 17/02/28 10:47:57 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1482343367445_0211 and attemptId None
17/02/28 10:47:58 INFO ContextLauncher: 17/02/28 10:47:58 INFO Client: Application report for application_1482343367445_0211 (state: ACCEPTED)
17/02/28 10:47:58 INFO ContextLauncher: 17/02/28 10:47:58 INFO Client:
17/02/28 10:47:58 INFO ContextLauncher: client token: N/A
17/02/28 10:47:58 INFO ContextLauncher: diagnostics: N/A
17/02/28 10:47:58 INFO ContextLauncher: ApplicationMaster host: N/A
17/02/28 10:47:58 INFO ContextLauncher: ApplicationMaster RPC port: -1
17/02/28 10:47:58 INFO ContextLauncher: queue: default
17/02/28 10:47:58 INFO ContextLauncher: start time: 1488278877511
17/02/28 10:47:58 INFO ContextLauncher: final status: UNDEFINED
17/02/28 10:47:58 INFO ContextLauncher: tracking URL: http://ip-10-200-139-129.ec2.internal:20888/proxy/application_1482343367445_0211/
17/02/28 10:47:58 INFO ContextLauncher: user: hadoop
17/02/28 10:47:59 INFO ContextLauncher: 17/02/28 10:47:59 INFO Client: Application report for application_1482343367445_0211 (state: ACCEPTED)
17/02/28 10:48:00 INFO ContextLauncher: 17/02/28 10:48:00 INFO Client: Application report for application_1482343367445_0211 (state: ACCEPTED)
17/02/28 10:48:01 INFO ContextLauncher: 17/02/28 10:48:01 INFO Client: Application report for application_1482343367445_0211 (state: ACCEPTED)
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> ip-10-200-139-129.ec2.internal, PROXY_URI_BASES -> http://ip-10-200-139-129.ec2.internal:20888/proxy/application_1482343367445_0211), /proxy/application_1482343367445_0211
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO Client: Application report for application_1482343367445_0211 (state: RUNNING)
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO Client:
17/02/28 10:48:02 INFO ContextLauncher: client token: N/A
17/02/28 10:48:02 INFO ContextLauncher: diagnostics: N/A
17/02/28 10:48:02 INFO ContextLauncher: ApplicationMaster host: 10.200.139.13
17/02/28 10:48:02 INFO ContextLauncher: ApplicationMaster RPC port: 0
17/02/28 10:48:02 INFO ContextLauncher: queue: default
17/02/28 10:48:02 INFO ContextLauncher: start time: 1488278877511
17/02/28 10:48:02 INFO ContextLauncher: final status: UNDEFINED
17/02/28 10:48:02 INFO ContextLauncher: tracking URL: http://ip-10-200-139-129.ec2.internal:20888/proxy/application_1482343367445_0211/
17/02/28 10:48:02 INFO ContextLauncher: user: hadoop
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO YarnClientSchedulerBackend: Application application_1482343367445_0211 has started running.
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43525.
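[Editor's note] At this point the YARN application has reached RUNNING. When a session dies later (as this one does), the container-side view of the failure can be pulled with the standard YARN CLI, for example:

    yarn logs -applicationId application_1482343367445_0211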
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO NettyBlockTransferService: Server created on 10.200.139.129:43525
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO BlockManager: external shuffle service port = 7337
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.200.139.129, 43525)
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO BlockManagerMasterEndpoint: Registering block manager 10.200.139.129:43525 with 414.4 MB RAM, BlockManagerId(driver, 10.200.139.129, 43525)
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.200.139.129, 43525)
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO EventLoggingListener: Logging events to hdfs:///var/log/spark/apps/application_1482343367445_0211
17/02/28 10:48:02 INFO ContextLauncher: 17/02/28 10:48:02 INFO Utils: Using initial executors = 1, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(null) (10.200.139.13:35662) with ID 1
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO ExecutorAllocationManager: New executor 1 has registered (new total is 1)
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO BlockManagerMasterEndpoint: Registering block manager ip-10-200-139-13.ec2.internal:40663 with 2.8 GB RAM, BlockManagerId(1, ip-10-200-139-13.ec2.internal, 40663)
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO SharedState: Warehouse path is 'hdfs:/user/spark/warehouse'.
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO SparkInterpreter: Created Spark session.
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO SparkUI: Stopped Spark web UI at http://10.200.139.129:4040
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO YarnClientSchedulerBackend: Interrupting monitor thread
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO YarnClientSchedulerBackend: Shutting down all executors
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
17/02/28 10:48:05 INFO ContextLauncher: (serviceOption=None,
17/02/28 10:48:05 INFO ContextLauncher: services=List(),
17/02/28 10:48:05 INFO ContextLauncher: started=false)
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO YarnClientSchedulerBackend: Stopped
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO MemoryStore: MemoryStore cleared
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO BlockManager: BlockManager stopped
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO BlockManagerMaster: BlockManagerMaster stopped
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/02/28 10:48:05 INFO ContextLauncher: 17/02/28 10:48:05 INFO SparkContext: Successfully stopped SparkContext
17/02/28 10:48:05 WARN RSCClient: Client RPC channel closed unexpectedly.
17/02/28 10:48:05 INFO ContextLauncher: Exception in thread "main" scala.reflect.internal.FatalError: object Predef does not have a member classOf
17/02/28 10:48:05 WARN RSCClient: Error stopping RPC.
io.netty.util.concurrent.BlockingOperationException: DefaultChannelPromise@3da3f98d(uncancellable)
    at io.netty.util.concurrent.DefaultPromise.checkDeadLock(DefaultPromise.java:390)
    at io.netty.channel.DefaultChannelPromise.checkDeadLock(DefaultChannelPromise.java:157)
    at io.netty.util.concurrent.DefaultPromise.await(DefaultPromise.java:251)
    at io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:129)
    at io.netty.channel.DefaultChannelPromise.await(DefaultChannelPromise.java:28)
    at io.netty.util.concurrent.DefaultPromise.sync(DefaultPromise.java:218)
    at io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:117)
    at io.netty.channel.DefaultChannelPromise.sync(DefaultChannelPromise.java:28)
    at com.cloudera.livy.rsc.rpc.Rpc.close(Rpc.java:307)
    at com.cloudera.livy.rsc.RSCClient.stop(RSCClient.java:225)
    at com.cloudera.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:122)
    at com.cloudera.livy.rsc.RSCClient$2$1.onSuccess(RSCClient.java:116)
    at com.cloudera.livy.rsc.Utils$2.operationComplete(Utils.java:108)
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:680)
    at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:567)
    at io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:406)
    at io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:82)
    at io.netty.channel.AbstractChannel$CloseFuture.setClosed(AbstractChannel.java:956)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.doClose0(AbstractChannel.java:608)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.close(AbstractChannel.java:586)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.closeOnRead(AbstractNioByteChannel.java:71)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:158)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)
17/02/28 10:48:05 INFO RSCClient: Failing pending job af9b1cc5-7f51-49dd-9d29-9a5a7ed611ad due to shutdown.
17/02/28 10:48:05 INFO ContextLauncher: at scala.reflect.internal.Definitions$DefinitionsClass.scala$reflect$internal$Definitions$DefinitionsClass$$fatalMissingSymbol(Definitions.scala:1186)
17/02/28 10:48:05 INFO ContextLauncher: at scala.reflect.internal.Definitions$DefinitionsClass.getMember(Definitions.scala:1203)
17/02/28 10:48:05 INFO ContextLauncher: at scala.reflect.internal.Definitions$DefinitionsClass.getMemberMethod(Definitions.scala:1238)
17/02/28 10:48:05 INFO ContextLauncher: at scala.reflect.internal.Definitions$DefinitionsClass$RunDefinitions.Predef_classOf$lzycompute(Definitions.scala:1469)
17/02/28 10:48:05 INFO ContextLauncher: at scala.reflect.internal.Definitions$DefinitionsClass$RunDefinitions.Predef_classOf(Definitions.scala:1469)
17/02/28 10:48:05 INFO ContextLauncher: at scala.reflect.internal.Definitions$DefinitionsClass$RunDefinitions.isPredefClassOf(Definitions.scala:1459)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typedIdent$2(Typers.scala:4885)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typedIdentOrWildcard$1(Typers.scala:4908)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5340)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5360)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5396)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5423)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5370)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5374)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.interpreter.ReplGlobal$$anon$1$$anon$2.typed(ReplGlobal.scala:36)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5472)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5480)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typedPackageDef$1(Typers.scala:5012)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typedMemberDef$1(Typers.scala:5312)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5359)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5396)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5423)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5370)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5374)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.interpreter.ReplGlobal$$anon$1$$anon$2.typed(ReplGlobal.scala:36)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5448)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.apply(Analyzer.scala:102)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.Global$GlobalPhase$$anonfun$applyPhase$1.apply$mcV$sp(Global.scala:440)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.Global$GlobalPhase.withCurrentUnit(Global.scala:431)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:440)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:94)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:93)
17/02/28 10:48:05 INFO ContextLauncher: at scala.collection.Iterator$class.foreach(Iterator.scala:893)
17/02/28 10:48:05 INFO ContextLauncher: at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.run(Analyzer.scala:93)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1501)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1486)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.Global$Run.compileSources(Global.scala:1481)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.interpreter.IMain.compileSourcesKeepingRun(IMain.scala:435)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compileAndSaveRun(IMain.scala:855)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compile(IMain.scala:813)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:675)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.SparkInterpreter$$anonfun$bind$1.apply(SparkInterpreter.scala:129)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.SparkInterpreter$$anonfun$bind$1.apply(SparkInterpreter.scala:129)
17/02/28 10:48:05 INFO ContextLauncher: at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.SparkInterpreter.bind(SparkInterpreter.scala:128)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.SparkContextInitializer$class.spark2CreateContext(SparkContextInitializer.scala:109)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.SparkContextInitializer$class.createSparkContext(SparkContextInitializer.scala:34)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.SparkInterpreter.createSparkContext(SparkInterpreter.scala:36)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply$mcV$sp(SparkInterpreter.scala:89)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:68)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:68)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.AbstractSparkInterpreter.restoreContextClassLoader(AbstractSparkInterpreter.scala:256)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.SparkInterpreter.start(SparkInterpreter.scala:68)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.Session$$anonfun$1.apply(Session.scala:76)
17/02/28 10:48:05 INFO ContextLauncher: at com.cloudera.livy.repl.Session$$anonfun$1.apply(Session.scala:74)
17/02/28 10:48:05 INFO ContextLauncher: at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
17/02/28 10:48:05 INFO ContextLauncher: at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
17/02/28 10:48:05 INFO ContextLauncher: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
17/02/28 10:48:05 INFO ContextLauncher: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
17/02/28 10:48:05 INFO ContextLauncher: at java.lang.Thread.run(Thread.java:745)
17/02/28 10:48:05 INFO InteractiveSession: Stopping InteractiveSession 0...
17/02/28 10:48:05 INFO InteractiveSession: Failed to ping RSC driver for session 0. Killing application.
17/02/28 10:48:06 INFO ContextLauncher: 17/02/28 10:48:06 INFO ShutdownHookManager: Shutdown hook called
17/02/28 10:48:06 INFO ContextLauncher: 17/02/28 10:48:06 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-80f8ede5-e1cc-46a2-bc2a-38be26a95933
17/02/28 10:48:06 INFO YarnClientImpl: Killed application application_1482343367445_0211
17/02/28 10:48:06 INFO InteractiveSession: Stopped InteractiveSession 0.
17/02/28 10:48:06 WARN ContextLauncher: Child process exited with code 1.
17/02/28 10:48:25 INFO LivyServer: Shutting down Livy server.
17/02/28 10:48:25 INFO InteractiveSession: Stopping InteractiveSession 0...
17/02/28 10:48:25 INFO InteractiveSession: Stopped InteractiveSession 0.
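[Editor's note] Reading the log as a whole: the fatal event is the scala.reflect.internal.FatalError raised while the Livy REPL was binding variables into the interpreter (SparkInterpreter.bind -> IMain.bind in the trace above). "object Predef does not have a member classOf" is the classic symptom of a Scala binary-version mismatch, and the dependency resolution at the top shows a Scala 2.10 artifact (mongo-spark-connector_2.10) pulled into a Scala 2.11 Spark 2.0.2 REPL. Everything after it (the RPC channel closing, the failed driver ping, the killed YARN application) is fallout from the interpreter crash, not an independent failure. A hedged sketch of a corrected session request, assuming the session was created through Livy's REST API (POST /sessions; adjust host, port, and fields to your deployment):

    curl -s -X POST http://ip-10-200-139-129.ec2.internal:8998/sessions \
      -H 'Content-Type: application/json' \
      -d '{
            "kind": "spark",
            "conf": {
              "spark.jars.packages": "com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M3,org.mongodb.spark:mongo-spark-connector_2.11:2.0.0"
            }
          }'

With both connectors on the _2.11 binary the interpreter should get past the bind step; if it still fails, checking scala.util.Properties.versionNumberString in a plain spark-shell confirms which Scala version the cluster's Spark build ships.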