Posted to issues@spark.apache.org by "Neeraj Goel (JIRA)" <ji...@apache.org> on 2016/06/20 12:15:05 UTC

[jira] [Commented] (SPARK-16066) Right now we don't have provision to pass custom param to executor dockers(Something like spark.mesos.executor.docker.parameters).

    [ https://issues.apache.org/jira/browse/SPARK-16066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15339401#comment-15339401 ] 

Neeraj Goel commented on SPARK-16066:
-------------------------------------

The use case in which I required this: we are running Spark on Mesos. Every Mesos slave has a log forwarder installed, and we want the executor Docker containers to use the syslog log-driver. We also want to set custom tags for each job so that every log line can be filtered by job name. For that, each Docker container needs to be launched with the appropriate log-driver and log-opt parameters. Mesos provides this capability by allowing each framework to supply custom parameters in DockerInfo, but the Spark framework currently does not use it and does not expose an interface for supplying that information. It would be great if we could have this capability.

Just FYI, I also have a patch ready for this if we decide this addition should go into Spark.
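
To make the request concrete, here is a rough Scala sketch of what supplying such parameters through the Mesos protobuf API looks like. This is illustrative only, not my patch; the image name and the log tag below are made-up placeholders for the syslog use case described above.

    import org.apache.mesos.Protos.{ContainerInfo, Parameter}

    // Illustrative only: build a DockerInfo carrying custom "docker run" parameters,
    // which is the mechanism a property like spark.mesos.executor.docker.parameters
    // could feed. The image name and tag value are placeholders, not real settings.
    val docker = ContainerInfo.DockerInfo.newBuilder()
      .setImage("example/spark-executor:latest")
      .addParameters(Parameter.newBuilder().setKey("log-driver").setValue("syslog").build())
      .addParameters(Parameter.newBuilder().setKey("log-opt").setValue("tag=my-spark-job").build())
      .build()

    // The executor's ContainerInfo would then point Mesos at this DockerInfo,
    // so the container is launched with the requested log-driver and log-opt flags.
    val container = ContainerInfo.newBuilder()
      .setType(ContainerInfo.Type.DOCKER)
      .setDocker(docker)
      .build()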

> Right now we don't have provision to pass custom param to executor dockers(Something like spark.mesos.executor.docker.parameters).
> ----------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16066
>                 URL: https://issues.apache.org/jira/browse/SPARK-16066
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler
>    Affects Versions: 1.5.1
>            Reporter: Neeraj Goel
>             Fix For: 1.5.1
>
>




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org