Details
Type: Bug
Status: Open
Priority: Blocker
Resolution: Unresolved
Affects Version/s: CDH 5.2.0
Fix Version/s: None
Component/s: Spark
Labels:
Environment: RHEL 6.1 (Santiago)
Description
This works fine:
$ bin/spark-submit --class org.apache.spark.examples.SparkPi --master local[2] examples/lib/spark-examples-1.1.0-cdh5.2.0-hadoop2.5.0-cdh5.2.0.jar 20
But this fails; it actually hangs at 10%:
$ bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn examples/lib/spark-examples-1.1.0-cdh5.2.0-hadoop2.5.0-cdh5.2.0.jar 20
There are no errors anywhere (logs, console, UI).
In the UI, the progress bar stays at 10% forever.
On the console, the last statement repeats forever.
...
...
14/10/16 19:31:56 INFO YarnClientClusterScheduler: Adding task set 0.0 with 20 tasks
With --deploy-mode client:
14/10/16 19:32:11 WARN YarnClientClusterScheduler: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory
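In client mode this warning repeats, which generally means YARN is not granting the executor containers the job requested. As a hedged sketch (these are standard spark-submit options, not taken from the original report, and the values are only illustrative), the same job can be resubmitted with explicit, smaller resource requests to rule out an over-sized default allocation:
$ bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode client --num-executors 2 --executor-memory 512m --executor-cores 1 examples/lib/spark-examples-1.1.0-cdh5.2.0-hadoop2.5.0-cdh5.2.0.jar 20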
With --deploy-mode cluster:
14/10/16 19:26:48 INFO Client: Application report from ResourceManager:
application identifier: application_1413360029759_0006
appId: 6
clientToAMToken: null
appDiagnostics:
appMasterHost: myhost
appQueue: root.root
appMasterRpcPort: 0
appStartTime: 1413512305511
yarnAppState: RUNNING
distributedFinalState: UNDEFINED
appTrackingUrl: http://myhost:8088/proxy/application_1413360029759_0006/
appUser: root
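The cluster-mode report above keeps repeating with yarnAppState: RUNNING and distributedFinalState: UNDEFINED and never progresses. As a hedged diagnostic sketch (standard YARN CLI commands, not part of the original report), the application status and the NodeManagers' registered resources can be checked to see whether executor containers are ever allocated:
$ yarn application -status application_1413360029759_0006
$ yarn node -list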