Posted to issues@spark.apache.org by "rameshkrishnan muthusamy (Jira)" <ji...@apache.org> on 2020/08/11 14:17:00 UTC

[jira] [Comment Edited] (SPARK-27997) kubernetes client token expired

    [ https://issues.apache.org/jira/browse/SPARK-27997?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17175601#comment-17175601 ] 

rameshkrishnan muthusamy edited comment on SPARK-27997 at 8/11/20, 2:16 PM:
----------------------------------------------------------------------------

[~phoerious] I have completed the required OIDC integration and am working on the unit tests for it. You should have the PR in the coming week. The driver will take the necessary details of an OIDC/auth provider and use them to automatically refresh the token when making requests to the Kubernetes API.


was (Author: ramkrish1489):
[~phoerious] I have completed the OIDC integration required, working on the UT for the same. You should have the PR in the coming week. 

> kubernetes client token expired 
> --------------------------------
>
>                 Key: SPARK-27997
>                 URL: https://issues.apache.org/jira/browse/SPARK-27997
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes, Spark Core
>    Affects Versions: 3.1.0
>            Reporter: Henry Yu
>            Priority: Major
>
> Hi,
> When I try to submit Spark to k8s in cluster mode, I need an auth token to talk to k8s.
> Unfortunately, many cloud providers issue tokens that expire within 10-15 minutes, so we need to refresh this token.
> Client mode is even worse, because the scheduler is created in the submit process.
> Should I also make a PR on this? I fixed it by adding a
> RotatingOAuthTokenProvider and some configuration.
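The rotating-token idea described above can be sketched as follows. This is an illustrative Scala sketch only: the trait and class names mirror the comment (OAuthTokenProvider, RotatingOAuthTokenProvider) but are assumptions, not the actual Spark or fabric8 client API. The provider caches a short-lived token and re-fetches it once a configured TTL has elapsed, so callers always get a token that has not expired.

```scala
// Hypothetical interface for supplying a bearer token to the Kubernetes client.
trait OAuthTokenProvider {
  def getToken(): String
}

// Caches a token and refreshes it once it is older than ttlMillis.
// fetchToken: callback that obtains a fresh token from the OIDC/auth
//   provider (e.g. by re-reading a projected service-account token file).
// now: injectable clock, defaulting to the system clock, to ease testing.
class RotatingOAuthTokenProvider(
    fetchToken: () => String,
    ttlMillis: Long,
    now: () => Long = () => System.currentTimeMillis()
) extends OAuthTokenProvider {

  // (token, timestamp when it was fetched)
  private var cached: Option[(String, Long)] = None

  override def getToken(): String = synchronized {
    cached match {
      case Some((tok, fetchedAt)) if now() - fetchedAt < ttlMillis =>
        tok // still fresh: reuse the cached token
      case _ =>
        val tok = fetchToken() // absent or past TTL: fetch a new one
        cached = Some((tok, now()))
        tok
    }
  }
}
```

In practice the TTL would be set comfortably below the cloud provider's 10-15 minute expiry so the token is rotated before the API server starts rejecting it.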



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org