Posted to yarn-dev@hadoop.apache.org by Arun Ahuja <aa...@gmail.com> on 2013/04/29 21:46:11 UTC

Re: Running Spark on YARN example

Has anyone else tried this demo out? Trying the yarn-dev group as well.

Thanks

Arun


On Thu, Apr 25, 2013 at 10:24 AM, Arun Ahuja <aa...@gmail.com> wrote:

> I'm trying to run this example here:
>
> http://www.spark-project.org/docs/0.6.0/running-on-yarn.html
>
> and running into a few problems.
>
> Here is the command run:
>
> SPARK_JAR=./core/target/scala-2.9.3/spark-core_2.9.3-0.8.0-SNAPSHOT.jar
> ./run spark.deploy.yarn.Client --jar
> examples/target/scala-2.9.3/spark-examples_2.9.3-0.8.0-SNAPSHOT.jar --class
> spark.examples.SparkPi
>
> and it seems to connect to YARN and submit the job, but then it cannot find the
> app jar.  Is something renaming the app jar?  Do I have to copy it to
> each cluster node manually?
>
> 13/04/25 10:13:38 INFO yarn.Client: Requesting new Application
> 13/04/25 10:13:38 INFO yarn.Client: Got new ApplicationId:
> application_1366863580429_0009
> 13/04/25 10:13:38 INFO yarn.Client: Max mem capabililty of resources in
> this cluster 8192
> 13/04/25 10:13:38 INFO yarn.Client: Setting up application submission
> context for ASM
> 13/04/25 10:13:38 INFO yarn.Client: Preparing Local resources
> 13/04/25 10:13:38 INFO yarn.Client: Uploading
> core/target/scala-2.9.3/spark-core_2.9.3-0.8.0-SNAPSHOT.jar to
> file:/home/arun/spark/9spark.jar
> 13/04/25 10:13:38 INFO yarn.Client: Uploading
> examples/target/scala-2.9.3/spark-examples_2.9.3-0.8.0-SNAPSHOT.jar to
> file:/home/arun/spark/9app.jar
> 13/04/25 10:13:38 INFO yarn.Client: Setting up the launch environment
> 13/04/25 10:13:38 INFO yarn.Client: Setting up container launch context
> 13/04/25 10:13:38 INFO yarn.Client: Command for the ApplicationMaster:
> java  -server -Xmx640m  spark.deploy.yarn.ApplicationMaster --class
> spark.examples.SparkPi --jar
> examples/target/scala-2.9.3/spark-examples_2.9.3-0.8.0-SNAPSHOT.jar
> --worker-memory 1024 --worker-cores 1 --num-workers 2 1> <LOG_DIR>/stdout
> 2> <LOG_DIR>/stderr
> 13/04/25 10:13:38 INFO yarn.Client: Submitting application to ASM
> 13/04/25 10:13:39 INFO yarn.Client: Application report from ASM:
>          application identifier: application_1366863580429_0009
>          appId: 9
>          clientToken: null
>          appDiagnostics:
>          appMasterHost: N/A
>          appQueue: default
>          appMasterRpcPort: 0
>          appStartTime: 1366899218546
>          yarnAppState: null
>          distributedFinalState: UNDEFINED
>          appTrackingUrl:
> hdp47.303net.pvt:8088/proxy/application_1366863580429_0009/
>          appUser: arun
> 13/04/25 10:13:40 INFO yarn.Client: Application report from ASM:
>          application identifier: application_1366863580429_0009
>          appId: 9
>          clientToken: null
>          appDiagnostics: Application application_1366863580429_0009 failed
> 1 times due to AM Container for appattempt_1366863580429_0009_000001 exited
> with  exitCode: -1000 due to: RemoteTrace:
> java.io.FileNotFoundException: File file:/home/arun/spark/9app.jar does
> not exist
>
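The uploads in the log above go to file:/home/arun/spark/ rather than to HDFS, which suggests the client is staging its resources on the local filesystem; a NodeManager on another host then has nothing to localize, hence the FileNotFoundException for 9app.jar. A minimal sketch of a submission that stages the jars on a shared filesystem instead, assuming the cluster configuration lives in /etc/hadoop/conf and that its core-site.xml points the default filesystem at an hdfs:// URI (the path and that assumption are placeholders, not taken from this thread):

    # Point the run script at the cluster's Hadoop configuration so the
    # default filesystem resolves to HDFS rather than the local filesystem.
    export HADOOP_CONF_DIR=/etc/hadoop/conf

    export SPARK_JAR=./core/target/scala-2.9.3/spark-core_2.9.3-0.8.0-SNAPSHOT.jar
    ./run spark.deploy.yarn.Client \
      --jar examples/target/scala-2.9.3/spark-examples_2.9.3-0.8.0-SNAPSHOT.jar \
      --class spark.examples.SparkPi \
      --num-workers 2 \
      --worker-memory 1024 \
      --worker-cores 1

With the default filesystem set to HDFS, yarn.Client should upload spark.jar and app.jar to a location every NodeManager can read, so the jars would not need to be copied to each node by hand.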