Posted to issues@spark.apache.org by "Eran Withana (JIRA)" <ji...@apache.org> on 2016/03/10 04:15:41 UTC

[jira] [Commented] (SPARK-12345) Mesos cluster mode is broken

    [ https://issues.apache.org/jira/browse/SPARK-12345?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15188589#comment-15188589 ] 

Eran Withana commented on SPARK-12345:
--------------------------------------

Is the resolution to this issue available in the Spark 1.6.0 release?

I just used Spark 1.6.0 and got the following error in the Mesos logs when it tried to run the task:

```
I0310 03:13:11.417009 131594 exec.cpp:132] Version: 0.23.1
I0310 03:13:11.419452 131601 exec.cpp:206] Executor registered on slave 20160223-000314-3439362570-5050-631-S0
sh: 1: /usr/spark-1.6.0-bin-hadoop2.6/bin/spark-class: not found
```
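For anyone hitting the same thing: that "spark-class: not found" line usually means the Mesos agent has no Spark distribution at the path baked into the executor command. A sketch of two possible workarounds, using Spark's documented Mesos settings `spark.mesos.executor.home` and `spark.executor.uri` (the master URL, application class, jar, and paths below are placeholders for your own cluster):

```
# Sketch only; substitute paths/URLs that are valid in your cluster.

# Option 1: Spark is pre-installed at the same path on every Mesos agent.
bin/spark-submit \
  --master mesos://<dispatcher-host>:7077 \
  --deploy-mode cluster \
  --conf spark.mesos.executor.home=/usr/spark-1.6.0-bin-hadoop2.6 \
  --class org.example.Main \
  my-app.jar

# Option 2: let each executor download and unpack its own distribution.
bin/spark-submit \
  --master mesos://<dispatcher-host>:7077 \
  --deploy-mode cluster \
  --conf spark.executor.uri=http://<host>/spark-1.6.0-bin-hadoop2.6.tgz \
  --class org.example.Main \
  my-app.jar
```

Either way, the path or URI has to be valid on the agents, not on the machine running spark-submit.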

> Mesos cluster mode is broken
> ----------------------------
>
>                 Key: SPARK-12345
>                 URL: https://issues.apache.org/jira/browse/SPARK-12345
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 1.6.0
>            Reporter: Andrew Or
>            Assignee: Timothy Chen
>            Priority: Critical
>             Fix For: 1.6.0
>
>
> The same setup worked in 1.5.2 but is now failing for 1.6.0-RC2.
> The driver is confused about where SPARK_HOME is. It resolves `mesos.executor.uri` or `spark.mesos.executor.home` relative to the filesystem where the driver runs, which is wrong.
> {code}
> I1215 15:00:39.411212 28032 exec.cpp:134] Version: 0.25.0
> I1215 15:00:39.413512 28037 exec.cpp:208] Executor registered on slave 130bdc39-44e7-4256-8c22-602040d337f1-S1
> bin/spark-submit: line 27: /Users/dragos/workspace/Spark/dev/rc-tests/spark-1.6.0-bin-hadoop2.6/bin/spark-class: No such file or directory
> {code}
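To make the quoted failure mode concrete, here is a minimal Scala sketch (hypothetical names, not Spark's actual scheduler code) of how falling back to the submitter's environment leaks a driver-local path into the command that runs on the agents:

```
// Illustrative sketch only -- hypothetical names, not Spark's real code.
object ExecutorHomeSketch {
  // Buggy shape: fall back to the SUBMITTER's SPARK_HOME, a path on the
  // machine running spark-submit, not on the Mesos agents.
  def buggyExecutorHome(conf: Map[String, String]): String =
    conf.getOrElse("spark.mesos.executor.home",
      sys.env.getOrElse("SPARK_HOME", "."))  // local path leaks into the task

  // Safer shape: require an agent-side location explicitly instead of
  // guessing from the driver's environment.
  def fixedExecutorHome(conf: Map[String, String]): String =
    conf.getOrElse("spark.mesos.executor.home",
      sys.error("spark.mesos.executor.home must point to Spark on the agents"))
}
```

The `/Users/dragos/...` path in the quoted log is exactly this shape: a directory that exists only on the machine where spark-submit ran.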


