Posted to issues@spark.apache.org by "hemshankar sahu (JIRA)" <ji...@apache.org> on 2019/06/12 04:55:00 UTC

[jira] [Comment Edited] (SPARK-27891) Long running spark jobs fail because of HDFS delegation token expires

    [ https://issues.apache.org/jira/browse/SPARK-27891?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16861751#comment-16861751 ] 

hemshankar sahu edited comment on SPARK-27891 at 6/12/19 4:54 AM:
------------------------------------------------------------------

Updating this to critical, as we use Spark Streaming, which is expected to run for more than 1 week.


was (Author: hemshankar_sahu):
Updating this as critical, as we use spark streaming which runs more than 1 week.

> Long running spark jobs fail because of HDFS delegation token expires
> ---------------------------------------------------------------------
>
>                 Key: SPARK-27891
>                 URL: https://issues.apache.org/jira/browse/SPARK-27891
>             Project: Spark
>          Issue Type: Bug
>          Components: Security
>    Affects Versions: 2.0.1, 2.1.0, 2.3.1, 2.4.1
>            Reporter: hemshankar sahu
>            Priority: Critical
>         Attachments: application_1559242207407_0001.log, spark_2.3.1_failure.log
>
>
> When the spark job runs on a secured cluster for longer than the time specified in the dfs.namenode.delegation.token.renew-interval property of hdfs-site.xml, the spark job fails.
> Following command was used to submit the spark job
> bin/spark-submit --principal acekrbuser --keytab ~/keytabs/acekrbuser.keytab --master yarn --deploy-mode cluster examples/src/main/python/wordcount.py /tmp/ff1.txt
>  
> Application Logs attached
>  

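The failure window described above is governed by the HDFS delegation token settings on the cluster. A minimal hdfs-site.xml sketch for reference (the values shown are the usual Hadoop defaults of a 24-hour renew interval and a 7-day maximum lifetime, which lines up with streaming jobs failing after about a week; actual cluster values may differ):

```xml
<!-- hdfs-site.xml: delegation token timing (common defaults; cluster-specific values may vary) -->
<property>
  <name>dfs.namenode.delegation.token.renew-interval</name>
  <value>86400000</value>   <!-- 24 hours, in milliseconds -->
</property>
<property>
  <name>dfs.namenode.delegation.token.max-lifetime</name>
  <value>604800000</value>  <!-- 7 days, in milliseconds -->
</property>
```

Supplying --principal and --keytab, as in the submit command quoted above, is the documented way to let Spark log in again and obtain fresh tokens beyond these limits; per this report, the job still fails on the affected versions.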


--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org