Posted to issues@spark.apache.org by "Pedro Gonçalves Rossi Rodrigues (Jira)" <ji...@apache.org> on 2020/01/31 17:03:00 UTC
[jira] [Commented] (SPARK-25355) Support --proxy-user for Spark on K8s
[ https://issues.apache.org/jira/browse/SPARK-25355?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17027665#comment-17027665 ]
Pedro Gonçalves Rossi Rodrigues commented on SPARK-25355:
---------------------------------------------------------
[~vanzin] I ran a simple test for this issue: I invoked spark-submit with the proxy-user option and inspected the generated container args, and they did not include --proxy-user. I then made a copy of the driver pod, added the --proxy-user option by hand, and it worked. The fix is essentially to pass the proxy-user argument through to the driver command when the cluster type is Kubernetes. I am going to submit a patch for this issue.
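A minimal sketch of the idea in Scala (names are illustrative, not the actual Spark internals): append the --proxy-user flag to the driver command only when the master URL points at Kubernetes.

```scala
// Hypothetical sketch of the fix described above; ProxyUserArg and
// driverArgs are illustrative names, not Spark's real internals.
object ProxyUserArg {
  // Append "--proxy-user <user>" to the driver command args only when
  // the master URL uses the k8s:// scheme; otherwise leave args alone.
  def driverArgs(master: String,
                 proxyUser: Option[String],
                 base: Seq[String]): Seq[String] =
    proxyUser match {
      case Some(user) if master.startsWith("k8s://") =>
        base ++ Seq("--proxy-user", user)
      case _ =>
        base
    }
}
```

For example, driverArgs("k8s://https://1.2.3.4:6443", Some("alice"), Seq("--class", "Main")) yields the base args followed by --proxy-user alice, while a yarn master leaves the args unchanged.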
> Support --proxy-user for Spark on K8s
> -------------------------------------
>
> Key: SPARK-25355
> URL: https://issues.apache.org/jira/browse/SPARK-25355
> Project: Spark
> Issue Type: New Feature
> Components: Kubernetes
> Affects Versions: 3.0.0
> Reporter: Stavros Kontopoulos
> Priority: Major
>
> SPARK-23257 adds kerberized hdfs support for Spark on K8s. A major addition needed is the support for proxy user. A proxy user is impersonated by a superuser who executes operations on behalf of the proxy user. More on this:
> [https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/Superusers.html]
> [https://github.com/spark-notebook/spark-notebook/blob/master/docs/proxyuser_impersonation.md]
> This has been implemented for Yarn upstream and Spark on Mesos here:
> [https://github.com/mesosphere/spark/pull/26]
> [~ifilonenko] creating this issue according to our discussion.
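For context, impersonation also has to be allowed on the Hadoop side; per the Superusers doc linked above, a minimal core-site.xml fragment might look like the following (the superuser name "super" and group "sparkusers" are illustrative):

```xml
<!-- Illustrative core-site.xml fragment: allow user "super" to
     impersonate members of group "sparkusers" from any host. -->
<property>
  <name>hadoop.proxyuser.super.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.super.groups</name>
  <value>sparkusers</value>
</property>
```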
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org