Details
- Type: Bug
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Version/s: 0.3.0
- Fix Version/s: None
- Component/s: None
- Labels:
Description
This was found in the stress test on Bolt 80. I'm not sure whether this is expected behavior. Li has suggested adding --master=yarn-client to the spark-submit command as a possible fix.
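For reference, this is the same invocation with the suggested flag added (untested sketch; the jar path, properties file, and query are copied from the failing run below):

spark-submit --master=yarn-client \
  --class com.cloudera.recordservice.examples.spark.RecordCount \
  --properties-file /etc/recordservice/conf/spark.conf \
  /home/alex.leblang/recordservice-client-0.3.0-cdh5.7.x/lib/recordservice-examples-spark-0.3.0-cdh5.7.x.jar \
  "select count(*) from tpcds_50_text.store_sales"

Full output from the failing run: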
[alex.leblang@e1102 PersonalScripts]$ spark-submit --class com.cloudera.recordservice.examples.spark.RecordCount --properties-file /etc/recordservice/conf/spark.conf /home/alex.leblang/recordservice-client-0.3.0-cdh5.7.x/lib/recordservice-examples-spark-0.3.0-cdh5.7.x.jar "select count(*) from tpcds_50_text.store_sales"
16/04/13 18:20:00 INFO spark.SparkContext: Running Spark version 1.6.0
16/04/13 18:20:13 INFO spark.SecurityManager: Changing view acls to: alex.leblang
16/04/13 18:20:13 INFO spark.SecurityManager: Changing modify acls to: alex.leblang
16/04/13 18:20:13 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(alex.leblang); users with modify permissions: Set(alex.leblang)
16/04/13 18:20:16 INFO util.Utils: Successfully started service 'sparkDriver' on port 55001.
16/04/13 18:20:20 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/04/13 18:20:21 INFO Remoting: Starting remoting
16/04/13 18:20:22 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.20.126.102:44466]
16/04/13 18:20:22 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriverActorSystem@10.20.126.102:44466]
16/04/13 18:20:22 INFO util.Utils: Successfully started service 'sparkDriverActorSystem' on port 44466.
16/04/13 18:20:22 INFO spark.SparkEnv: Registering MapOutputTracker
16/04/13 18:20:22 INFO spark.SparkEnv: Registering BlockManagerMaster
16/04/13 18:20:22 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-de8e5436-587d-4064-9ded-3b25cae306eb
16/04/13 18:20:22 INFO storage.MemoryStore: MemoryStore started with capacity 530.3 MB
16/04/13 18:20:23 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/04/13 18:20:25 WARN thread.QueuedThreadPool: 1 threads could not be stopped
16/04/13 18:20:26 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
16/04/13 18:20:26 WARN util.Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
16/04/13 18:20:26 WARN util.Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
16/04/13 18:20:27 WARN util.Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
16/04/13 18:20:27 WARN util.Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
16/04/13 18:20:27 WARN util.Utils: Service 'SparkUI' could not bind on port 4045. Attempting port 4046.
16/04/13 18:20:27 WARN thread.QueuedThreadPool: 3 threads could not be stopped
16/04/13 18:20:27 WARN util.Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.
16/04/13 18:20:27 WARN util.Utils: Service 'SparkUI' could not bind on port 4047. Attempting port 4048.
16/04/13 18:20:28 WARN util.Utils: Service 'SparkUI' could not bind on port 4048. Attempting port 4049.
16/04/13 18:20:28 WARN util.Utils: Service 'SparkUI' could not bind on port 4049. Attempting port 4050.
16/04/13 18:20:28 WARN util.Utils: Service 'SparkUI' could not bind on port 4050. Attempting port 4051.
16/04/13 18:20:28 WARN util.Utils: Service 'SparkUI' could not bind on port 4051. Attempting port 4052.
16/04/13 18:20:29 WARN util.Utils: Service 'SparkUI' could not bind on port 4052. Attempting port 4053.
16/04/13 18:20:29 WARN util.Utils: Service 'SparkUI' could not bind on port 4053. Attempting port 4054.
16/04/13 18:20:29 WARN util.Utils: Service 'SparkUI' could not bind on port 4054. Attempting port 4055.
16/04/13 18:20:29 WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
16/04/13 18:20:30 ERROR ui.SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries!
  at sun.nio.ch.Net.bind0(Native Method)
  at sun.nio.ch.Net.bind(Net.java:444)
  at sun.nio.ch.Net.bind(Net.java:436)
  at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
  at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
  at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
  at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
  at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
  at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  at org.spark-project.jetty.server.Server.doStart(Server.java:293)
  at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
  at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:283)
  at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:293)
  at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:293)
  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1989)
  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1980)
  at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:293)
  at org.apache.spark.ui.WebUI.bind(WebUI.scala:137)
  at org.apache.spark.SparkContext$$anonfun$14.apply(SparkContext.scala:492)
  at org.apache.spark.SparkContext$$anonfun$14.apply(SparkContext.scala:492)
  at scala.Option.foreach(Option.scala:236)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:492)
  at com.cloudera.recordservice.examples.spark.RecordCount$.main(RecordCount.scala:51)
  at com.cloudera.recordservice.examples.spark.RecordCount.main(RecordCount.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
  at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
  at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/04/13 18:20:30 INFO storage.DiskBlockManager: Shutdown hook called
16/04/13 18:20:30 INFO util.ShutdownHookManager: Shutdown hook called
16/04/13 18:20:30 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-df684e69-7a81-4a0f-917a-6f214dffb79c
16/04/13 18:20:30 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-df684e69-7a81-4a0f-917a-6f214dffb79c/userFiles-d5524d6b-79c4-44b4-9438-c10630711404
[alex.leblang@e1102 PersonalScripts]$