Posted to issues@spark.apache.org by "Dexter Hu (Jira)" <ji...@apache.org> on 2020/10/07 07:45:00 UTC
[jira] [Updated] (SPARK-33083) optionally skip remote/local dependency resolution at submission client in k8s cluster mode
[ https://issues.apache.org/jira/browse/SPARK-33083?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dexter Hu updated SPARK-33083:
------------------------------
Summary: optionally skip remote/local dependency resolution at submission client in k8s cluster mode (was: support submission client remote/local dependency check in k8s cluster mode)
> optionally skip remote/local dependency resolution at submission client in k8s cluster mode
> -------------------------------------------------------------------------------------------
>
> Key: SPARK-33083
> URL: https://issues.apache.org/jira/browse/SPARK-33083
> Project: Spark
> Issue Type: Improvement
> Components: Kubernetes, Spark Submit
> Affects Versions: 2.4.7
> Reporter: Dexter Hu
> Priority: Minor
> Fix For: 2.4.7
>
>
> Use case:
> # Users use Apache Livy to submit Spark jobs. The Livy pod's spark-submit command is invoked to submit jobs to the k8s cluster with *--deploy-mode cluster*.
> # The Livy pod and the future driver/executor pods have *different* permissions to access the remote dependencies specified by --files, --jars, and --py-files from S3, GCS, or hdfs://.
> Since the submission client host (the Livy pod in this case) doesn't have permission to download the remote files, is it possible to support an option to *disable* remote resource resolution/download at the Livy pod or submission client host?
> Of course, users will make sure these remote files are accessible by the driver and executor pods with secure tokens, but not by the Livy pod.
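>
> A minimal sketch of the requested behavior (the skip flag below is *hypothetical*, not an existing Spark configuration): spark-submit on the Livy pod would leave the remote URIs unresolved and pass them through to the driver pod, which downloads them with its own credentials. The bucket names and file paths are placeholders.
>
> ```shell
> # Hypothetical: skipDependencyResolution is the option this issue requests,
> # not a flag Spark currently supports. All s3a:// paths are examples.
> spark-submit \
>   --master k8s://https://kubernetes.default.svc \
>   --deploy-mode cluster \
>   --conf spark.kubernetes.submission.skipDependencyResolution=true \
>   --files s3a://example-bucket/conf/app.conf \
>   --jars s3a://example-bucket/libs/dep.jar \
>   --py-files s3a://example-bucket/code/helpers.zip \
>   s3a://example-bucket/code/main.py
> ```
>
> With such a flag set, the submission client would neither validate nor fetch the URIs; resolution would happen only inside the driver/executor pods, which hold the necessary access tokens.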
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org