Posted to issues@spark.apache.org by "Saisai Shao (JIRA)" <ji...@apache.org> on 2016/12/13 09:30:58 UTC

[jira] [Comment Edited] (SPARK-18840) HDFSCredentialProvider throws exception in non-HDFS security environment

    [ https://issues.apache.org/jira/browse/SPARK-18840?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15744370#comment-15744370 ] 

Saisai Shao edited comment on SPARK-18840 at 12/13/16 9:30 AM:
---------------------------------------------------------------

This problem also exists in branch 1.6, but the fix is a little different from the one for master.


was (Author: jerryshao):
This problem also existed in branch 1.6, but the fix is a little complicated compared to master.

> HDFSCredentialProvider throws exception in non-HDFS security environment
> ------------------------------------------------------------------------
>
>                 Key: SPARK-18840
>                 URL: https://issues.apache.org/jira/browse/SPARK-18840
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.6.3, 2.1.0
>            Reporter: Saisai Shao
>            Priority: Minor
>
> Currently, the code logic in {{HDFSCredentialProvider}} assumes that an HDFS delegation token always exists. This is fine in an HDFS environment, but in some cloud environments such as Azure, HDFS is not required, so it throws an exception:
> {code}
> java.util.NoSuchElementException: head of empty list
>         at scala.collection.immutable.Nil$.head(List.scala:337)
>         at scala.collection.immutable.Nil$.head(List.scala:334)
>         at org.apache.spark.deploy.yarn.Client.getTokenRenewalInterval(Client.scala:627)
> {code}
> We should also handle this situation.
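
The failure mode above is {{List.head}} being called on an empty list of HDFS delegation tokens. A minimal sketch of the defensive pattern such a fix could take is shown below; the {{Token}} case class and the token-kind string are illustrative stand-ins, not the actual Hadoop {{Credentials}} API:

```scala
object RenewalIntervalSketch {
  // Illustrative stand-in for a Hadoop delegation token; in the real code
  // these come from the credentials obtained by the YARN client.
  case class Token(kind: String, renewalInterval: Long)

  // Original style of logic: assumes at least one HDFS token exists, so
  // .head throws java.util.NoSuchElementException on an empty list.
  def renewalIntervalUnsafe(tokens: List[Token]): Long =
    tokens
      .filter(_.kind == "HDFS_DELEGATION_TOKEN")
      .map(_.renewalInterval)
      .head

  // Defensive variant: return None when no HDFS token is present, letting
  // the caller skip token renewal entirely in non-HDFS environments.
  def renewalInterval(tokens: List[Token]): Option[Long] =
    tokens
      .filter(_.kind == "HDFS_DELEGATION_TOKEN")
      .map(_.renewalInterval)
      .reduceOption(_ min _)

  def main(args: Array[String]): Unit = {
    println(renewalInterval(Nil))                                       // None
    println(renewalInterval(List(Token("HDFS_DELEGATION_TOKEN", 86400000L))))
  }
}
```

Returning {{Option[Long]}} instead of {{Long}} pushes the "no HDFS token" case into the type, so the caller must decide what to do when renewal does not apply rather than crashing.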



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
