Posted to user@mesos.apache.org by Rad Gruchalski <ra...@gruchalski.com> on 2015/09/16 22:20:53 UTC
Mesos 0.24.0 with spark in docker - error
Dear list,
I’m here for the first time, so I apologise if I’m misbehaving. I am currently trying to run Scala Spark Notebook on Mesos with Docker in bridge networking. I would like to launch Spark executors on the same cluster the master is running on. This requires registering a framework from inside the container and being able to accept the offers back inside it. I found out that the necessary settings, LIBPROCESS_ADVERTISE_IP and LIBPROCESS_ADVERTISE_PORT, were added in Mesos 0.24.0, which seems to be released (at least it is tagged in git). I have the cluster running Mesos 0.24.0 and it has been behaving really well so far.
However, I’m having a problem with the Spark Notebook docker image.
To build Mesos itself, I use mesosphere/mesos-deb-packaging:
git clone https://github.com/mesosphere/mesos-deb-packaging.git .
git checkout d7e5b7b5a8a04b11eaee6a1f9a0962ef3e77864a
And I apply this patch (for Mesos 0.24.0):
diff --git a/build_mesos b/build_mesos
index 81561bc..f756ef0 100755
--- a/build_mesos
+++ b/build_mesos
@@ -313,9 +313,10 @@ function deb_ {
--deb-recommends zookeeperd
--deb-recommends zookeeper-bin
-d 'java-runtime-headless'
- -d libcurl3
- -d libsvn1
- -d libsasl2-modules
+ -d libcurl4-nss-dev
+ -d libsasl2-dev
+ -d libapr1-dev
+ -d libsvn-dev
--after-install "$this/$asset_dir/mesos.postinst"
--after-remove "$this/$asset_dir/mesos.postrm" )
rm -f "$this"/pkg.deb
My Mesos setup:
git clone https://github.com/apache/mesos.git .
git fetch origin
git checkout 0.24.0
With gcc-4.9.2, oracle-java7-installer, and the following installed (Ubuntu 12.04):
apt-get install -y autoconf libtool
apt-get install -y python-dev python-boto libcurl4-nss-dev libsasl2-dev maven libapr1-dev libsvn-dev git-core software-properties-common \
python-software-properties ruby1.9.1 ruby1.9.1-dev \
build-essential libxslt-dev libxml2-dev \
wget zlibc zlib1g zlib1g-dev
gem install fpm --no-ri --no-rdoc
Building Mesos with this command:
mkdir -p versions/0.24.0
cd versions/0.24.0
git clone https://github.com/apache/mesos.git .
git fetch origin
git checkout 0.24.0
cd ../..
./build_mesos --src-dir versions/0.24.0
This gives me a .deb package that can be installed successfully on a clean ubuntu:14.04 Docker image after installing only these packages:
apt-get install -y openjdk-7-jre build-essential python-dev python-boto libcurl4-nss-dev libsasl2-dev maven libapr1-dev libsvn-dev
I build on ubuntu:12.04. The solution needs to run on 12.04.
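To sanity-check what the package will demand at install time, the declared dependencies can be pulled out of the control metadata. This is only a sketch: `deps_from_control_line` is a hypothetical helper name, and `dpkg-deb` is the standard Debian tool for reading package fields.

```shell
# Print one dependency per line from a dpkg "Depends:" control line on stdin,
# with version constraints such as "(>= 2.1)" stripped.
deps_from_control_line() {
  sed -e 's/^Depends: *//' -e 's/ *([^)]*)//g' | tr ',' '\n' | sed 's/^ *//'
}

# Typical use against the built package:
#   dpkg-deb --field mesos-0.24.0.deb Depends | deps_from_control_line
```

Comparing that list against what the target image actually has installed is a quick way to see whether the patched `-d` flags above ended up in the package as intended.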
This is how I build the notebook. There are two images: the first is tagged myorg/openjdk-7-jre-trusty, and the second is generated by sbt; the most important commands are listed below.
# myorg/openjdk-7-jre-trusty
FROM ubuntu:trusty
MAINTAINER ...
RUN \
apt-get update && \
apt-get install -y openjdk-7-jre && \
rm -rf /var/lib/apt/lists/*
ENV JAVA_HOME /usr/lib/jvm/java-7-openjdk-amd64
The notebook itself:
FROM myorg/openjdk-7-jre-trusty
MAINTAINER …
RUN apt-get update -y && apt-get install -y wget build-essential python-dev python-boto libcurl4-nss-dev libsasl2-dev maven libapr1-dev libsvn-dev
RUN cd /tmp && wget https://to.my/location/where/I/store/built/mesos-0.24.0.deb && dpkg -i mesos-0.24.0.deb
ENV MESOS_JAVA_NATIVE_LIBRARY /usr/lib/libmesos.so
ENV MESOS_LOG_DIR /var/log/mesos
I attempt to run this code in Spark Notebook:
# all the hosts used are resolvable, files can be accessed on HDFS:
reset("spark-notebook-tests", lastChanges = (c:SparkConf) => {
c.set("spark.mesos.executor.uri", "hdfs:///resources/spark-1.4.1-cdh4.tar.gz")
.set("spark.master", "mesos://zk://zookeeper-0:2181,zookeeper-1:2181,zookeeper-2:2181/mesos")
.set("spark.local.dir", System.getenv("MESOS_SANDBOX"))
})
The Spark Notebook is launched with this command:
export LIBPROCESS_PORT=10000; export LIBPROCESS_ADVERTISE_IP=10.XXX.XXX.XXX; export LIBPROCESS_ADVERTISE_PORT=$PORT1; /opt/docker/bin/spark-notebook
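Since the whole bridge-mode setup hinges on those two advertise variables being present, a small guard ahead of the launch line can catch a missing port mapping early. A sketch, with `require_advertise_env` as a hypothetical helper name; the variable names are the ones libprocess reads as of Mesos 0.24.0:

```shell
# Succeed only when both libprocess advertise settings are present; without
# them, the framework registers with its container-internal address, which
# the master cannot reach in bridge networking.
require_advertise_env() {
  [ -n "${LIBPROCESS_ADVERTISE_IP:-}" ] && [ -n "${LIBPROCESS_ADVERTISE_PORT:-}" ]
}

# Typical use before starting the notebook:
#   require_advertise_env || { echo "advertise IP/port not set" >&2; exit 1; }
#   exec /opt/docker/bin/spark-notebook
```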
The expected result would be at least a framework registration attempt and a trace in the logs showing that offers cannot be delivered. Instead, I receive the error listed below:
java.lang.UnsatisfiedLinkError: /usr/lib/libmesos-0.24.0.so: /usr/lib/x86_64-linux-gnu/libcurl-nss.so.4: version `CURL_3' not found (required by /usr/lib/libmesos-0.24.0.so)
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1965)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1890)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1880)
at java.lang.Runtime.loadLibrary0(Runtime.java:849)
at java.lang.System.loadLibrary(System.java:1088)
at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:54)
at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:79)
at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2537)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:489)
at $iwC$$iwC$$anon$1.reset(<console>:71)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:57)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:59)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:61)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:63)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:65)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:67)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:69)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:71)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:73)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:75)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:77)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:79)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:81)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:83)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:85)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:87)
at $iwC$$iwC$$iwC.<init>(<console>:89)
at $iwC$$iwC.<init>(<console>:91)
at $iwC.<init>(<console>:93)
at <init>(<console>:95)
at .<init>(<console>:99)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at notebook.kernel.Repl$$anonfun$3.apply(Repl.scala:173)
at notebook.kernel.Repl$$anonfun$3.apply(Repl.scala:173)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at scala.Console$.withOut(Console.scala:126)
at notebook.kernel.Repl.evaluate(Repl.scala:172)
at notebook.client.ReplCalculator$$anonfun$10$$anon$1$$anonfun$22.apply(ReplCalculator.scala:348)
at notebook.client.ReplCalculator$$anonfun$10$$anon$1$$anonfun$22.apply(ReplCalculator.scala:345)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
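For anyone debugging the same kind of `UnsatisfiedLinkError`, the version tag the library demands can be read out of its dynamic symbol table. The helper below only does the text filtering, so it can be fed from `objdump -T`. A sketch: `extract_undef_versions` is a hypothetical name, and the field layout assumes GNU binutils output for versioned undefined symbols.

```shell
# From `objdump -T <lib>` output on stdin, print "VERSION symbol" for each
# undefined symbol, i.e. the versioned symbols the library expects its
# dependencies to provide.
extract_undef_versions() {
  awk '/\*UND\*/ { print $(NF-1), $NF }'
}

# Typical use for this error: compare what libmesos requires against what
# the installed libcurl flavour actually exports, e.g.
#   objdump -T /usr/lib/libmesos-0.24.0.so | extract_undef_versions | grep '^CURL'
```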
I am not sure whether I misunderstand the dependencies or whether there is another issue. I have seen this error mentioned twice in Google’s cache of IRC channels, with no solution given. What could possibly be wrong here?
Any pointers are appreciated.
Kind regards,
Radek Gruchalski
radek@gruchalski.com
de.linkedin.com/in/radgruchalski/
Confidentiality:
This communication is intended for the above-named person and may be confidential and/or legally privileged.
If it has come to you in error you must take no action based on it, nor must you copy or show it to anyone; please delete/destroy and inform the sender immediately.
Re: Mesos 0.24.0 with spark in docker - error
Posted by Rad Gruchalski <ra...@gruchalski.com>.
Marco,
The same setup with Mesos 0.23.0 attempts to register the framework but, as expected, fails because the framework request carries an incorrect address (Docker in bridge mode).
Kind regards,
Radek Gruchalski
On Wednesday, 16 September 2015 at 22:58, Marco Massenzio wrote:
> Radek:
>
> I'm afraid I have some difficulty in following all your steps, so won't be able to comment on all that.
> However, last I tried to run a Spark shell (both Python and Scala) against a deployed Spark cluster (I was using DCOS CE) that did not work.
>
> AFAIK running a Spark shell is not supported (but others may have succeeded in doing so, please chime in) - did you try to run a spark-submit job and see if you could get that one running?
>
> As per your error:
>
> > java.lang.UnsatisfiedLinkError: /usr/lib/libmesos-0.24.0.so: /usr/lib/x86_64-linux-gnu/libcurl-nss.so.4: version `CURL_3' not found (required by /usr/lib/libmesos-0.24.0.so)
>
> You are right: it's clearly a missing dependency in libcurl (FWIW, I don't think Mesos supports 12.04).
>
> Marco Massenzio
> Distributed Systems Engineer
> http://codetrips.com
Re: Mesos 0.24.0 with spark in docker - error
Posted by Marco Massenzio <ma...@mesosphere.io>.
Radek:
I'm afraid I have some difficulty in following all your steps, so won't be
able to comment on all that.
However, the last time I tried to run a Spark shell (both Python and Scala)
against a deployed Spark cluster (I was using DCOS CE), it did not work.
AFAIK running a Spark shell is not supported (but others may have succeeded
in doing so, please chime in) - did you try to run a spark-submit job and
see if you could get that one running?
As per your error:
> java.lang.UnsatisfiedLinkError: /usr/lib/libmesos-0.24.0.so:
> /usr/lib/x86_64-linux-gnu/libcurl-nss.so.4: version `CURL_3' not found
> (required by /usr/lib/libmesos-0.24.0.so)
You are right: it's clearly a missing dependency in libcurl (FWIW, I don't
think Mesos supports 12.04).
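One quick confirmation on the target image is to ask the dynamic loader directly; the helper below just filters `ldd` output for unresolved entries, which also catches the "version `X' not found" case. A sketch, with `missing_from_ldd` as a hypothetical name:

```shell
# Print every line of `ldd <binary>` output (piped on stdin) that reports an
# unresolved dependency or a missing symbol version.
missing_from_ldd() {
  awk '/not found/'
}

# Typical use:
#   ldd /usr/lib/libmesos-0.24.0.so | missing_from_ldd
```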
Marco Massenzio
Distributed Systems Engineer
http://codetrips.com
Re: Mesos 0.24.0 with spark in docker - error
Posted by tommy xiao <xi...@gmail.com>.
OK, Rad. I don't think this is caused by a Docker compatibility issue.
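One quick way to rule Docker out (a sketch; the library path is the one from the error message above, adjust as needed): try loading libmesos directly on the runtime host, with no container involved. If the dynamic linker reports the same `CURL_3` error there too, the mismatch is between libmesos and the host's libcurl, not Docker.

```shell
# Load libmesos directly via ctypes -- no Docker involved.
# If this prints "version `CURL_3' not found" as well, the problem is
# the runtime system's libcurl, not the container setup.
python -c "import ctypes; ctypes.CDLL('/usr/lib/libmesos-0.24.0.so')"
```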
2015-09-17 15:54 GMT+08:00 Rad Gruchalski <ra...@gruchalski.com>:
> I found a fix. It’s rather obvious: build on the same system you’re going
> to deploy to.
> Scala Spark Notebook built on 14.04 and running on 14.04 works without any
> issues.
>
> Kind regards,
> Radek Gruchalski
> radek@gruchalski.com <ra...@gruchalski.com>
> de.linkedin.com/in/radgruchalski/
>
>
>
--
Deshi Xiao
Twitter: xds2000
E-mail: xiaods(AT)gmail.com
Re: Mesos 0.24.0 with spark in docker - error
Posted by Rad Gruchalski <ra...@gruchalski.com>.
I found a fix. It’s rather obvious: build on the same system you’re going to deploy to.
Scala Spark Notebook built on 14.04 and running on 14.04 works without any issues.
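For anyone else who hits this, a sketch of how to see the mismatch directly (paths taken from the error message above; adjust for your system): compare the CURL_* symbol versions libmesos requires against the ones the installed libcurl actually exports.

```shell
# Version nodes libmesos was linked against (built on 12.04):
objdump -T /usr/lib/libmesos-0.24.0.so | grep -o 'CURL_[0-9A-Z_.]*' | sort -u

# Version nodes the runtime libcurl (14.04) provides:
objdump -T /usr/lib/x86_64-linux-gnu/libcurl-nss.so.4 | grep -o 'CURL_[0-9A-Z_.]*' | sort -u

# ldd -r also reports unresolved versioned symbols directly:
ldd -r /usr/lib/libmesos-0.24.0.so 2>&1 | grep -i curl
```

If the first listing shows a version (here `CURL_3`) that is missing from the second, the library cannot load; building on the same release you deploy to makes the two sets agree.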
Kind regards,
Radek Gruchalski
radek@gruchalski.com
de.linkedin.com/in/radgruchalski/