Posted to issues@spark.apache.org by "Uri Laserson (JIRA)" <ji...@apache.org> on 2015/04/28 00:20:39 UTC

[jira] [Commented] (SPARK-7177) Create standard way to wrap Spark CLI scripts for external projects

    [ https://issues.apache.org/jira/browse/SPARK-7177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14515131#comment-14515131 ] 

Uri Laserson commented on SPARK-7177:
-------------------------------------

cc [~vanzin]

> Create standard way to wrap Spark CLI scripts for external projects
> -------------------------------------------------------------------
>
>                 Key: SPARK-7177
>                 URL: https://issues.apache.org/jira/browse/SPARK-7177
>             Project: Spark
>          Issue Type: Improvement
>            Reporter: Uri Laserson
>
> Many external projects built on Spark provide CLI scripts to launch their applications.  For example, the ADAM project has {{adam-submit}}, {{adam-shell}}, and {{adam-pyspark}} commands that mirror the Spark versions but modify the CLASSPATH and set some extra options to make things easy for the user (a sketch of such a wrapper appears below).  Because these applications can take a mix of Spark-specific and application-specific options, they need to use Spark's internal CLI tools to separate the two before calling {{spark-submit}} or {{spark-shell}} (e.g., using {{gatherSparkSubmitOpts}}).  Since this functionality is considered internal, though, it has changed a few times in the past.
> It would be great if there were a stable "extensibility" API for building shell wrappers, one that would be unlikely to need significant changes when Spark's internals change.
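> A minimal sketch of the kind of wrapper in question, assuming a hypothetical application jar and main class ({{MY_APP_JAR}}, {{/opt/myapp/myapp-assembly.jar}}, and {{com.example.MyApp}} are illustrative, not an existing API):
> {code}
> #!/usr/bin/env bash
> # Hypothetical adam-submit-style wrapper; all names and paths here are
> # assumptions for illustration, not part of Spark or ADAM.
>
> # Jar containing the application's classes.
> MY_APP_JAR="/opt/myapp/myapp-assembly.jar"
>
> # Prepend the application's classes to the driver classpath and
> # delegate to the stock spark-submit script. Everything in "$@" is
> # forwarded as application arguments; a real wrapper would first have
> # to split out Spark options (which must precede the jar), which is
> # exactly the gatherSparkSubmitOpts-style splitting at issue here.
> exec "${SPARK_HOME}/bin/spark-submit" \
>   --driver-class-path "${MY_APP_JAR}" \
>   --class com.example.MyApp \
>   "${MY_APP_JAR}" \
>   "$@"
> {code}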



