Posted to user@spark.apache.org by Eric Kimbrel <le...@gmail.com> on 2014/01/08 04:16:21 UTC

Spark on Yarn classpath problems

I am trying to run Spark version 0.8.1 on Hadoop 2.2.0-cdh5.0.0-beta-1 with YARN.

I am using the YARN Client in yarn-standalone mode, as described here: http://spark.incubator.apache.org/docs/latest/running-on-yarn.html

To simplify matters, I'll say my application code is all contained in application.jar, and it additionally depends on code in dependency.jar.

I launch my spark application as follows:

SPARK_JAR=<SPARK_ASSEMBLY_JAR_FILE> ./spark-class org.apache.spark.deploy.yarn.Client \
  --jar application.jar \
  --class <My main class> \
  --args <app specific arguments> \
  --num-workers <NUMBER_OF_WORKER_MACHINES> \
  --master-memory <MEMORY_FOR_MASTER> \
  --worker-memory <MEMORY_PER_WORKER> \
  --worker-cores <CORES_PER_WORKER> \
  --name <application_name> \
  --addJars dependency.jar


YARN accepts the job and it starts to execute, but as the job runs it quickly dies with ClassNotFoundExceptions for classes that live in dependency.jar.

As an attempted fix I tried bundling all of the dependencies into a single jar, "application-with-dependencies.jar". I specify this jar with the --jar option and remove the --addJars line. Unfortunately this did not alleviate the issue and the ClassNotFoundExceptions continued.
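
In case it is useful, this is the kind of check I can run to confirm the classes really made it into the jars (the class name here is only a placeholder for whatever shows up in the ClassNotFoundException):

# Placeholder class name; substitute the class reported in the ClassNotFoundException.
jar tf application-with-dependencies.jar | grep SomeMissingClass

# Same check against the standalone dependency jar passed via --addJars.
jar tf dependency.jar | grep SomeMissingClass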



RE: Spark on Yarn classpath problems

Posted by "Liu, Raymond" <ra...@intel.com>.
1. --files should be enough.
2. --files will read the file and distribute it onto the cluster. You can also put the file on HDFS and point to it to save the upload time, though it still needs to be downloaded into the worker container (this will be done by the YARN container automatically). A rough example is sketched below.
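
For illustration only, the HDFS variant might look roughly like this (the HDFS path is made up, and whether --files accepts an HDFS URI in this version should be verified):

# One-time: stage the dependency on HDFS so the client does not upload it on every submit.
hadoop fs -mkdir -p /user/me/libs
hadoop fs -put dependency.jar /user/me/libs/dependency.jar

# Launch, pointing the YARN client at the staged jar (path and option usage illustrative).
SPARK_JAR=<SPARK_ASSEMBLY_JAR_FILE> ./spark-class org.apache.spark.deploy.yarn.Client \
  --jar application.jar \
  --class <My main class> \
  --args <app specific arguments> \
  --files hdfs:///user/me/libs/dependency.jar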


Best Regards,
Raymond Liu


-----Original Message-----
From: Eric Kimbrel [mailto:lekimbrel@gmail.com] 

Interesting. I'll inspect the assembly and take a look, but I have a couple of follow-up questions:

1. If the class is needed both in the SparkContext thread and in the workers, would I need to add it twice, with --addJars and with --file?
2. With the --file method, will I need to place the jar at that location on each node of the cluster, or does the YARN client read the file and distribute it onto the cluster?

Thanks for the response.





Re: Spark on Yarn classpath problems

Posted by Eric Kimbrel <le...@gmail.com>.
Interesting. I'll inspect the assembly and take a look, but I have a couple of follow-up questions:

1. If the class is needed both in the SparkContext thread and in the workers, would I need to add it twice, with --addJars and with --file?
2. With the --file method, will I need to place the jar at that location on each node of the cluster, or does the YARN client read the file and distribute it onto the cluster?

Thanks for the response.



On Jan 7, 2014, at 8:44 PM, Liu, Raymond <ra...@intel.com> wrote:

> Not found in which part of the code? If it is in the SparkContext thread, say on the AM, --addJars should work.
> 
> If it is in the tasks, then --addJars won't work; you need to use --file=local://xxx etc., though I am not sure whether it is available in 0.8.1. Building everything into a single jar should also work; if it does not, something might be wrong with the assembly.
> 
> Best Regards,
> Raymond Liu


RE: Spark on Yarn classpath problems

Posted by "Liu, Raymond" <ra...@intel.com>.
Not found in which part of the code? If it is in the SparkContext thread, say on the AM, --addJars should work.

If it is in the tasks, then --addJars won't work; you need to use --file=local://xxx etc., though I am not sure whether it is available in 0.8.1. Building everything into a single jar should also work; if it does not, something might be wrong with the assembly.
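
For concreteness, a rough sketch of the two cases (paths are placeholders, and as noted I am not sure the --file option is available in 0.8.1):

# Case 1: the class is only needed by the SparkContext thread on the AM;
# --addJars alone should cover it.
SPARK_JAR=<SPARK_ASSEMBLY_JAR_FILE> ./spark-class org.apache.spark.deploy.yarn.Client \
  --jar application.jar \
  --class <My main class> \
  --addJars dependency.jar

# Case 2: the class is needed inside the tasks; --addJars is not enough there,
# so ship the jar to the worker containers (option syntax as suggested above, unverified for 0.8.1).
SPARK_JAR=<SPARK_ASSEMBLY_JAR_FILE> ./spark-class org.apache.spark.deploy.yarn.Client \
  --jar application.jar \
  --class <My main class> \
  --file=local:///opt/libs/dependency.jar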

Best Regards,
Raymond Liu
