Posted to issues@spark.apache.org by "Klaus Ma (JIRA)" <ji...@apache.org> on 2015/10/16 05:34:05 UTC

[jira] [Created] (SPARK-11143) SparkMesosDispatcher can not launch driver in docker

Klaus Ma created SPARK-11143:
--------------------------------

             Summary: SparkMesosDispatcher can not launch driver in docker
                 Key: SPARK-11143
                 URL: https://issues.apache.org/jira/browse/SPARK-11143
             Project: Spark
          Issue Type: Bug
          Components: Mesos
    Affects Versions: 1.5.1
         Environment: Ubuntu 14.04
            Reporter: Klaus Ma


I'm working on the integration between Mesos and Spark. For now, I can start the SparkMesosDispatcher in a Docker container, and I would also like to run the Spark executor in a Mesos Docker container. I used the following configuration for this, but I get an error; any suggestions?

Configuration:

Spark: conf/spark-defaults.conf

spark.mesos.executor.docker.image        ubuntu
spark.mesos.executor.docker.volumes      /usr/bin:/usr/bin,/usr/local/lib:/usr/local/lib,/usr/lib:/usr/lib,/lib:/lib,/home/test/workshop/spark:/root/spark
spark.mesos.executor.home                /root/spark
#spark.executorEnv.SPARK_HOME             /root/spark
spark.executorEnv.MESOS_NATIVE_LIBRARY   /usr/local/lib

NOTE: Spark is installed in /home/test/workshop/spark, and all dependencies are installed.
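
For reference, the submission to the dispatcher was done in cluster mode, roughly like the sketch below (the dispatcher host and the examples jar path are placeholders, not copied from the actual command):

# Hypothetical cluster-mode submission to the Mesos dispatcher;
# <dispatcher-host> and <path-to>/spark-examples.jar are placeholders.
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master mesos://<dispatcher-host>:7077 \
  --deploy-mode cluster \
  <path-to>/spark-examples.jar 100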

After submitting SparkPi to the dispatcher, the driver job starts but fails. The error message is:

I1015 11:10:29.488456 18697 exec.cpp:134] Version: 0.26.0
I1015 11:10:29.506619 18699 exec.cpp:208] Executor registered on slave b7e24114-7585-40bc-879b-6a1188cb65b6-S1
WARNING: Your kernel does not support swap limit capabilities, memory limited without swap.
/bin/sh: 1: ./bin/spark-submit: not found

Does anyone know how to map/set the Spark home in Docker for this case?
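
As a quick sanity check (a sketch only, reusing the "ubuntu" image and volume mapping from the configuration above), the following can verify whether spark-submit is actually visible inside the executor container at the configured spark.mesos.executor.home:

# Manual check, assuming the same image and volume mapping as in spark-defaults.conf;
# it should list /root/spark/bin/spark-submit if the mount works as intended.
docker run --rm \
  -v /home/test/workshop/spark:/root/spark \
  ubuntu \
  ls -l /root/spark/bin/spark-submit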



