Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2019/02/12 19:38:00 UTC

[jira] [Resolved] (SPARK-26789) [k8s] pyspark needs to upload local resources to driver and executor pods

     [ https://issues.apache.org/jira/browse/SPARK-26789?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-26789.
------------------------------------
    Resolution: Duplicate

> [k8s] pyspark needs to upload local resources to driver and executor pods
> -------------------------------------------------------------------------
>
>                 Key: SPARK-26789
>                 URL: https://issues.apache.org/jira/browse/SPARK-26789
>             Project: Spark
>          Issue Type: New Feature
>          Components: Kubernetes, PySpark
>    Affects Versions: 2.4.0
>            Reporter: Oleg Frenkel
>            Priority: Major
>
> The Kubernetes support provided with [https://github.com/apache-spark-on-k8s/spark] allows local dependencies to be used in cluster deploy mode. Specifically, its Resource Staging Server uploads local dependencies to Kubernetes so that driver and executor pods can download them. It looks like the Spark 2.4.0 release does not support local dependencies.
> For example, the following command is expected to automatically upload pi.py from the local machine to the Kubernetes cluster and make it available to both driver and executor pods:
> {{bin/spark-submit --conf spark.app.name=example.python.pi --master k8s://http://127.0.0.1:8001 --deploy-mode cluster --conf spark.kubernetes.container.image=spark-py:spark-2.4.0 ./examples/src/main/python/pi.py}}
>  
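For context on the duplicate resolution: Spark 3.0 later added built-in upload of client-local dependencies for Kubernetes cluster mode via the spark.kubernetes.file.upload.path setting, which stages local files in a Hadoop-compatible location that the driver pod reads back. Below is a sketch of the equivalent submission; the bucket name and image tag are placeholders, and it assumes the submitting client has the matching Hadoop filesystem connector (e.g. hadoop-aws for s3a://) on its classpath:

  bin/spark-submit \
    --master k8s://http://127.0.0.1:8001 \
    --deploy-mode cluster \
    --conf spark.app.name=example.python.pi \
    --conf spark.kubernetes.container.image=spark-py:spark-3.0.0 \
    --conf spark.kubernetes.file.upload.path=s3a://my-bucket/spark-uploads \
    ./examples/src/main/python/pi.py

With the upload path configured, spark-submit copies ./examples/src/main/python/pi.py to that location before creating the driver pod, so the file no longer needs to be baked into the container image.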



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org