Posted to user@spark.apache.org by Hu...@Dell.com on 2014/01/21 01:44:06 UTC
Error: Could not find or load main class org.apache.spark.executor.CoarseGrainedExecutorBackend
Hi,
I am using Spark 0.8.0 with Hadoop 1.2.1 in standalone cluster mode with 3 worker nodes and 1 master.
Can someone help me with this error I am getting when running my app on the Spark cluster?
Error: Could not find or load main class org.apache.spark.executor.CoarseGrainedExecutorBackend
The executor command on the worker node is:
Spark Executor Command: "java" "-cp" ":/opt/spark-0.8.0/conf:/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar" "-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" "-Xms49152M" "-Xmx49152M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka://spark@poc1:54483/user/CoarseGrainedScheduler" "2" "poc3" "16"
I checked the logs on the Spark master as well as the Spark workers, but found not much info beyond the above error.
Thanks,
Hussam
RE: Error: Could not find or load main class org.apache.spark.executor.CoarseGrainedExecutorBackend
Posted by Hu...@Dell.com.
I found the issue: my app was looking for the wrong Spark jar.
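For anyone hitting the same symptom: since a jar is just a zip archive, you can confirm whether the class the executor fails to load is actually packaged in the jar your app references. A minimal sketch, assuming `unzip` is available and using the jar path from this thread:

```shell
#!/bin/sh
# Check whether the missing class is really packaged in the jar the app
# points at; a wrong or stale jar path makes this class lookup fail.
JAR="/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar"
CLASS="org/apache/spark/executor/CoarseGrainedExecutorBackend.class"

has_class() {
  # $1 = jar path, $2 = class entry; unzip -l lists archive contents.
  unzip -l "$1" 2>/dev/null | grep -q "$2"
}

if has_class "$JAR" "$CLASS"; then
  echo "class present in $JAR"
else
  echo "class absent or jar unreadable: $JAR"
fi
```

If the class is absent, the app is pointing at the wrong (or stale) assembly jar, which produces exactly the "Could not find or load main class" error above.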
Thanks,
Hussam
From: Tathagata Das [mailto:tathagata.das1565@gmail.com]
Sent: Monday, January 20, 2014 6:17 PM
To: user@spark.incubator.apache.org
Subject: Re: Error: Could not find or load main class org.apache.spark.executor.CoarseGrainedExecutorBackend
Hi Hussam,
Have you (1) generated the Spark jar using sbt/sbt assembly, and (2) distributed the Spark jar to the worker machines? It could be that the system expects the Spark jar to be present on the classpath (/opt/spark-0.8.0/conf:/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar) on the worker machines, but it's not finding the jar and hence not finding the necessary class. Can you double-check whether the jar exists in that location on all the worker nodes?
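That check can be sketched as a small shell script; the jar path is copied from the executor command in this thread, and everything else is illustrative:

```shell
#!/bin/sh
# Check that the Spark assembly jar exists at the expected classpath
# location. Run this on each worker (path taken from the executor
# command in this thread; adjust for your layout).
JAR="/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar"

check_jar() {
  # Print OK or MISSING for the jar path in $1.
  if [ -f "$1" ]; then
    echo "OK: $1"
  else
    echo "MISSING: $1"
  fi
}

check_jar "$JAR"
```

To sweep all workers at once, something like `for h in poc1 poc2 poc3; do ssh "$h" test -f "$JAR" && echo "$h OK" || echo "$h MISSING"; done` does the same remotely (host names and passwordless ssh are assumptions here).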
TD
On Mon, Jan 20, 2014 at 4:44 PM, <Hu...@dell.com> wrote:
Hi,
I am using Spark 0.8.0 with Hadoop 1.2.1 in standalone cluster mode with 3 worker nodes and 1 master.
Can someone help me with this error I am getting when running my app on the Spark cluster?
Error: Could not find or load main class org.apache.spark.executor.CoarseGrainedExecutorBackend
The executor command on the worker node is:
Spark Executor Command: "java" "-cp" ":/opt/spark-0.8.0/conf:/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar" "-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" "-Xms49152M" "-Xmx49152M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka://spark@poc1:54483/user/CoarseGrainedScheduler" "2" "poc3" "16"
I checked the logs on the Spark master as well as the Spark workers, but found not much info beyond the above error.
Thanks,
Hussam
Re: Error: Could not find or load main class org.apache.spark.executor.CoarseGrainedExecutorBackend
Posted by Tathagata Das <ta...@gmail.com>.
Hi Hussam,
Have you (1) generated the Spark jar using sbt/sbt assembly, and (2) distributed the
Spark jar to the worker machines? It could be that the system expects the
Spark jar to be present on the classpath
(/opt/spark-0.8.0/conf:/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar)
on the worker machines, but it's not finding the jar and hence not finding
the necessary class. Can you double-check whether the jar exists in that
location on all the worker nodes?
TD
On Mon, Jan 20, 2014 at 4:44 PM, <Hu...@dell.com> wrote:
> Hi,
>
> I am using Spark 0.8.0 with Hadoop 1.2.1 in standalone cluster mode
> with 3 worker nodes and 1 master.
>
> Can someone help me with this error I am getting when running my app
> on the Spark cluster?
>
> Error: Could not find or load main class
> org.apache.spark.executor.CoarseGrainedExecutorBackend
>
> The executor command on the worker node is:
>
> Spark Executor Command: "java" "-cp" ":/opt/spark-0.8.0/conf:/opt/spark-0.8.0/assembly/target/scala-2.9.3/spark-assembly_2.9.3-0.8.0-incubating-hadoop1.0.4.jar" "-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" "-Dspark.local.dir=/home/hadoop/spark" "-Xms49152M" "-Xmx49152M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka://spark@poc1:54483/user/CoarseGrainedScheduler" "2" "poc3" "16"
>
> I checked the logs on the Spark master as well as the Spark workers,
> but found not much info beyond the above error.
>
> Thanks,
> Hussam