Posted to issues@spark.apache.org by "Daniel Glöckner (Jira)" <ji...@apache.org> on 2022/11/02 14:34:00 UTC

[jira] [Comment Edited] (SPARK-33782) Place spark.files, spark.jars and spark.files under the current working directory on the driver in K8S cluster mode

    [ https://issues.apache.org/jira/browse/SPARK-33782?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17627743#comment-17627743 ] 

Daniel Glöckner edited comment on SPARK-33782 at 11/2/22 2:33 PM:
------------------------------------------------------------------

Will this fix repair the {{--jars}} flag, and will JARs be added automatically to the driver and executor class path when using {{spark.kubernetes.file.upload.path}} / {{file://}} URIs?

https://spark.apache.org/docs/latest/running-on-kubernetes.html#dependency-management

https://spark.apache.org/docs/3.2.0/submitting-applications.html
??
When using spark-submit, the application jar along with any jars included with the --jars option will be automatically transferred to the cluster. URLs supplied after --jars must be separated by commas. That list is included in the driver and executor classpaths. ??
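
For concreteness, the scenario being asked about looks roughly like the following (the bucket, jar, and class names are placeholders, not taken from the ticket):
{code}
# Placeholder names throughout; this only sketches the submission being asked about.
spark-submit \
  --master k8s://https://example-apiserver:443 \
  --deploy-mode cluster \
  --jars file:///opt/libs/extra-lib.jar \
  --conf spark.kubernetes.file.upload.path=s3a://my-bucket/spark-uploads \
  --class com.example.MyApp \
  file:///opt/app/my-app.jar
# Open question: does extra-lib.jar then end up on the driver and executor
# class paths automatically, as described in submitting-applications.html?
{code}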


was (Author: JIRAUSER288949):
The this fix repair the {{--jars}} flag and will JARs be added automatically to the driver and executor class path when using {{spark.kubernetes.file.upload.path}} / {{file://}} URIs?

https://spark.apache.org/docs/latest/running-on-kubernetes.html#dependency-management

https://spark.apache.org/docs/3.2.0/submitting-applications.html
??
When using spark-submit, the application jar along with any jars included with the --jars option will be automatically transferred to the cluster. URLs supplied after --jars must be separated by commas. That list is included in the driver and executor classpaths. ??

> Place spark.files, spark.jars and spark.files under the current working directory on the driver in K8S cluster mode
> -------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-33782
>                 URL: https://issues.apache.org/jira/browse/SPARK-33782
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 3.2.0
>            Reporter: Hyukjin Kwon
>            Priority: Major
>
> In YARN cluster mode, the passed files can be accessed from the current working directory. This does not appear to be the case in Kubernetes cluster mode.
> By doing this, users can, for example, leverage PEX to manage Python dependencies in Apache Spark:
> {code}
> pex pyspark==3.0.1 pyarrow==0.15.1 pandas==0.25.3 -o myarchive.pex
> PYSPARK_PYTHON=./myarchive.pex spark-submit --files myarchive.pex
> {code}
> See also https://github.com/apache/spark/pull/30735/files#r540935585.
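
Not part of the ticket, but a minimal sketch of the behaviour being requested (file and application names are made up): with passed files placed under the driver's working directory, as on YARN, a relative path would resolve there.
{code}
# Made-up names; sketches the requested behaviour (files visible in the driver's CWD).
spark-submit \
  --master k8s://https://example-apiserver:443 \
  --deploy-mode cluster \
  --files app.conf \
  local:///opt/app/my_app.py
# If app.conf is placed under the driver's working directory (as on YARN),
# my_app.py could simply open("app.conf") without resolving an upload path.
{code}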



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org