Posted to issues@spark.apache.org by "Marcelo Vanzin (Jira)" <ji...@apache.org> on 2019/08/20 19:46:00 UTC

[jira] [Updated] (SPARK-27937) Revert changes introduced as a part of Automatic namespace discovery [SPARK-24149]

     [ https://issues.apache.org/jira/browse/SPARK-27937?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin updated SPARK-27937:
-----------------------------------
    Docs Text: In Spark 3.0, the behavior for automatic delegation token retrieval for file systems is the same as in Spark 2.3. Users need to explicitly include the URIs they want to access in the spark.kerberos.access.hadoopFileSystems configuration. The automatic discovery added in Spark 2.4 (SPARK-24149) was removed.
       Labels: release-notes  (was: )

> Revert changes introduced as a part of Automatic namespace discovery [SPARK-24149]
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-27937
>                 URL: https://issues.apache.org/jira/browse/SPARK-27937
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.3
>            Reporter: Dhruve Ashar
>            Assignee: Dhruve Ashar
>            Priority: Major
>              Labels: release-notes
>             Fix For: 3.0.0
>
>
> Spark fails to launch on a valid HDFS deployment (with HDFS federation enabled) because it tries to get tokens for a logical nameservice instead of an actual namenode.
> On inspecting the source code closely, it is unclear why we were doing this. Based on the context from SPARK-24149, it solves a very specific use case: getting tokens only for those namenodes which are configured for HDFS federation in the same cluster. IMHO, these are better left to the user to specify explicitly.
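
With the revert, the explicit configuration described in the Docs Text might look like the following sketch of a spark-defaults.conf entry. The filesystem URIs below are hypothetical placeholders, not values from this issue; substitute your cluster's actual namenodes or nameservices. (In Spark 2.x the equivalent key was spark.yarn.access.hadoopFileSystems.)

```
# spark-defaults.conf (sketch; URIs are illustrative placeholders)
# Explicitly list every filesystem the application needs
# delegation tokens for -- no automatic namespace discovery.
spark.kerberos.access.hadoopFileSystems  hdfs://nn1.example.com:8020,hdfs://nameservice-b
```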



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org