Posted to user@spark.apache.org by fabrizio silvestri <fa...@isti.cnr.it> on 2013/08/14 23:47:46 UTC

SimpleJob.scala on a small cluster

Dear all,

I'm very new to Spark and I've been trying to get acquainted with its basics
using the examples in the documentation.

Let me state first that I'm able to run everything in spark-shell, including
when I run it against a Mesos master on a small cluster of 4 machines.

I've tried to compile and run the example from the "A Standalone Job in
Scala" section of the quick start guide:
http://spark-project.org/docs/latest/quick-start.html

As far as running the job locally goes, everything works.

I then modified the following line:

val sc = new SparkContext("local", "Simple Job", "$YOUR_SPARK_HOME",
List("target/scala-2.9.3/simple-project_2.9.3-1.0.jar"))

into

val sc = new SparkContext("mesos://xxx.yyy.zzz.it:5050[4]", "Simple Job",
"$YOUR_SPARK_HOME", List("target/scala-2.9.3/simple-project_2.9.3-1.0.jar"))

to run it on the above-mentioned cluster of four machines.

Here's the issue:

sbt run

[...]
13/08/14 23:45:09 INFO spark.SparkContext: Added JAR
target/scala-2.9.3/simple-project_2.9.3-1.0.jar at
http://aaa.bbb.ccc.ddd:47683/jars/simple-project_2.9.3-1.0.jar with
timestamp 1376516709537
Failed to load native Mesos library from
/home/spark/BINs/spark-0.7.3/lib/java
[error] (run-main) java.lang.UnsatisfiedLinkError: no mesos in
java.library.path
java.lang.UnsatisfiedLinkError: no mesos in java.library.path
  at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1878)
  at java.lang.Runtime.loadLibrary0(Runtime.java:849)
  at java.lang.System.loadLibrary(System.java:1087)
  at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:46)
  at spark.SparkContext.<init>(SparkContext.scala:170)
  at SimpleJob$.main(SimpleJob.scala:8)
  at SimpleJob.main(SimpleJob.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
13/08/14 23:45:09 INFO network.ConnectionManager: Selector thread was
interrupted!
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[error] {file:/home/spark/DEMOs/SimpleJob/}default-031d15/compile:run:
Nonzero exit code: 1
[error] Total time: 4 s, completed Aug 14, 2013 11:45:09 PM

I've looked around, but apparently I can't find any help on this.

Thanks in advance
f

Re: SimpleJob.scala on a small cluster

Posted by Josh Rosen <ro...@gmail.com>.
You probably need to set MESOS_NATIVE_LIBRARY in your shell before running
your job through sbt. When using spark-shell, MESOS_NATIVE_LIBRARY is set
in `spark-env.sh`, but that file is only read by Spark jobs that use
Spark's "run" script. You may have to manually `source spark-env.sh`
before `sbt run` (or `export MESOS_NATIVE_LIBRARY=...` in your shell).
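
For example, something along these lines should work (a rough sketch; the
libmesos path below is just a guess, so point it at wherever the Mesos
native library actually lives on your machines, and adjust $SPARK_HOME to
your installation):

  # make the Mesos native library visible to the JVM before launching sbt
  # (example path only; use your actual libmesos.so / libmesos.dylib)
  export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so

  # or pick the same setting up from Spark's config instead:
  # source $SPARK_HOME/conf/spark-env.sh

  sbt run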

