Posted to issues@spark.apache.org by "Erik Erlandson (JIRA)" <ji...@apache.org> on 2018/06/13 23:38:00 UTC

[jira] [Commented] (SPARK-24534) Add a way to bypass entrypoint.sh script if no spark cmd is passed

    [ https://issues.apache.org/jira/browse/SPARK-24534?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16511804#comment-16511804 ] 

Erik Erlandson commented on SPARK-24534:
----------------------------------------

I think this has potential use for customization beyond the openshift downstream. It allows derived images to leverage the apache spark base images in contexts outside of directly running the driver and executor processes.

> Add a way to bypass entrypoint.sh script if no spark cmd is passed
> ------------------------------------------------------------------
>
>                 Key: SPARK-24534
>                 URL: https://issues.apache.org/jira/browse/SPARK-24534
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: 2.3.0
>            Reporter: Ricardo Martinelli de Oliveira
>            Priority: Minor
>
> As an improvement to the entrypoint.sh script, I'd like to propose that the Spark entrypoint do a passthrough if the command passed is not driver/executor/init. Currently it raises an error.
> To be more specific, I'm talking about these lines:
> [https://github.com/apache/spark/blob/master/resource-managers/kubernetes/docker/src/main/dockerfiles/spark/entrypoint.sh#L113-L114]
> This would allow the openshift-spark image to continue to function as a Spark Standalone component, with custom configuration support etc., without compromising the existing method of configuring the cluster inside a Kubernetes environment.
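A minimal sketch of the proposed passthrough behavior, written as a standalone shell function for illustration (the function name and echo output are hypothetical; the real entrypoint.sh at the linked lines builds the actual Spark command and currently errors out on unknown commands instead):

```shell
#!/bin/bash
# Hypothetical passthrough logic for an entrypoint script: known Spark
# roles are handled as before, and any other command is executed as-is,
# letting derived images reuse the base image for other purposes.
run_entrypoint() {
  case "$1" in
    driver | executor | init)
      # The real entrypoint.sh constructs and execs the Spark command here.
      echo "spark role: $1"
      ;;
    *)
      # Passthrough: run the user-supplied command unchanged instead of
      # raising an "Unknown command" error.
      "$@"
      ;;
  esac
}

run_entrypoint echo "custom command ran"
```

In a real entrypoint the passthrough branch would typically use `exec "$@"` so the user command replaces the shell process and receives signals directly.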



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org