Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2019/02/15 20:50:00 UTC
[jira] [Resolved] (SPARK-26597) Support using images with different entrypoints on Kubernetes
[ https://issues.apache.org/jira/browse/SPARK-26597?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Marcelo Vanzin resolved SPARK-26597.
------------------------------------
Resolution: Duplicate
I'm consolidating all docker image-related bugs under SPARK-24655. All discussion about requirements that people have around docker images should go there.
> Support using images with different entrypoints on Kubernetes
> -------------------------------------------------------------
>
> Key: SPARK-26597
> URL: https://issues.apache.org/jira/browse/SPARK-26597
> Project: Spark
> Issue Type: New Feature
> Components: Kubernetes
> Affects Versions: 2.4.0
> Reporter: Patrick Clay
> Priority: Minor
>
> I wish to use arbitrary pre-existing docker images containing Spark with Kubernetes.
> Specifically, I wish to use [jupyter/all-spark-notebook|https://hub.docker.com/r/jupyter/all-spark-notebook] in in-cluster client mode, ideally without modifying the image (I think using images maintained by others is a key advantage of Docker).
> It contains the full Spark 2.4 binary tarball, including Spark's entrypoint.sh, but because Spark does not let the user specify a Kubernetes container command, I have to create a child image that sets entrypoint.sh as the entrypoint. I need separate images for the kernel/driver and the executor, because the kernel must keep Jupyter's entrypoint. Building a child image and setting it as the executor image works, but it is obnoxious just to change the entrypoint.
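> A minimal sketch of the child-image workaround described above; the Spark install path inside jupyter/all-spark-notebook is an assumption and should be verified in the image:
>
> ```dockerfile
> FROM jupyter/all-spark-notebook
> USER root
> # Restore Spark's Kubernetes entrypoint so the executor container starts
> # correctly; the path below is assumed, not confirmed against the image.
> ENTRYPOINT ["/usr/local/spark/kubernetes/dockerfiles/spark/entrypoint.sh"]
> ```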
> The crux of this feature request is to add a property that sets the executor (and driver) command, so it can point at entrypoint.sh.
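> For context, Spark 2.4 already lets you pick a different *image* per role, just not a different command; a sketch of the relevant properties (the executor image name is hypothetical):
>
> ```properties
> spark.kubernetes.driver.container.image=jupyter/all-spark-notebook
> spark.kubernetes.executor.container.image=my-registry/spark-executor:2.4.0
> ```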
> I personally don't see why you even have entrypoint.sh instead of making the command be _spark-class org.apache.spark.executor.CoarseGrainedExecutorBackend ..._, which seems a lot more portable (albeit reliant on the PATH).
> Speaking of reliance on PATH, that also broke for me: the image does not set JAVA_HOME, and it installs tini from Conda, which puts it at a different path. These are smaller issues; I'll file separate issues for them and try to work them out there.
> In general shouldn't Spark on k8s be less coupled to the layout of the image?
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org