Posted to issues@spark.apache.org by "Lei Xu (JIRA)" <ji...@apache.org> on 2015/10/29 09:28:27 UTC

[jira] [Commented] (SPARK-6284) Support framework authentication and role in Mesos framework

    [ https://issues.apache.org/jira/browse/SPARK-6284?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14980038#comment-14980038 ] 

Lei Xu commented on SPARK-6284:
-------------------------------

Joining this thread; we are hitting the same problem.

Mesos 0.25.0 - 3 masters & 38 slaves, with slave resources assigned to roles:

* cpus(spark):16
* ports(*):[8000-32000]
* mem(*):126976
* other framework roles...

Spark 1.5.0 was deployed in "cluster" mode with these three scheduler properties (a configuration sketch follows the list):

* spark.mesos.secret=4bbd7991de80b50bbbc5e272e7f67aff
* spark.mesos.role=spark
* spark.mesos.principal=spark
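
For reference, here is a minimal sketch of how the same three properties could be set programmatically on a SparkConf; they could equally be put in spark-defaults.conf or passed as --conf flags to spark-submit. The master URL, app name and secret value below are placeholders, not values from our cluster:

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal sketch: register the Mesos framework with a principal/secret
    // and ask for resources under the "spark" role. Placeholder values only.
    val conf = new SparkConf()
      .setAppName("role-check")
      .setMaster("mesos://zk://master1:2181,master2:2181,master3:2181/mesos")
      .set("spark.mesos.principal", "spark")
      .set("spark.mesos.secret", "<framework-secret>")
      .set("spark.mesos.role", "spark")

    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 10).sum())  // trivial job to force a task launch
    sc.stop()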

The task then fails with this message:

state: TASK_ERROR message: "Task uses more resources cpus(*):1; mem(*):2048 than available mem(*):126976; ports(*):[8000-32000]; disk(*):1.04152e+06; cpus(spark):16" slave_id { value: "69d525a7-3a63-4b4d-a269-5aa478f8c15f-S36" } timestamp: 1.446104736756061E9 source: SOURCE_MASTER reason: REASON_TASK_INVALID
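
Reading the message, the launched task requests its cpus under the default "*" role, while the slave's cpus are reserved for the "spark" role, so there are effectively no cpus(*) available and the master marks the task invalid. The sketch below is purely illustrative (not Spark's or Mesos' actual code; the Resource case class and fits helper are hypothetical) and only shows the difference between role-unaware and role-aware resource matching:

    // Hypothetical helper, for illustration only: checks whether an offer can
    // satisfy a resource request, counting resources reserved for the
    // framework's role in addition to unreserved ("*") resources.
    case class Resource(name: String, role: String, amount: Double)

    def fits(offer: Seq[Resource],
             need: Map[String, Double],
             frameworkRole: String): Boolean =
      need.forall { case (name, amount) =>
        val available = offer
          .filter(r => r.name == name && (r.role == "*" || r.role == frameworkRole))
          .map(_.amount)
          .sum
        available >= amount
      }

    // The offer from this report: cpus only under "spark", mem unreserved.
    val offer = Seq(Resource("cpus", "spark", 16), Resource("mem", "*", 126976))
    val need  = Map("cpus" -> 1.0, "mem" -> 2048.0)

    fits(offer, need, frameworkRole = "*")      // false: no cpus under "*"
    fits(offer, need, frameworkRole = "spark")  // true: 16 cpus under "spark"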

Could you please share an example of how to submit a job so that it runs under the "spark" role on Mesos?

> Support framework authentication and role in Mesos framework
> ------------------------------------------------------------
>
>                 Key: SPARK-6284
>                 URL: https://issues.apache.org/jira/browse/SPARK-6284
>             Project: Spark
>          Issue Type: Improvement
>          Components: Mesos
>            Reporter: Timothy Chen
>            Assignee: Timothy Chen
>             Fix For: 1.5.0
>
>
> Support framework authentication and role in both Coarse grain and fine grain mode.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
