Posted to issues@spark.apache.org by "Gabor Somogyi (JIRA)" <ji...@apache.org> on 2019/01/29 10:53:00 UTC
[jira] [Commented] (SPARK-26766) Remove the list of filesystems
from HadoopDelegationTokenProvider.obtainDelegationTokens
[ https://issues.apache.org/jira/browse/SPARK-26766?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16754865#comment-16754865 ]
Gabor Somogyi commented on SPARK-26766:
---------------------------------------
[~vanzin] I was thinking about your [suggestion|https://github.com/apache/spark/pull/23499/files#diff-406f99efa37915001b613de47815e25cR54] and here are my findings.
* YarnSparkHadoopUtil.hadoopFSsToAccess covers everything that's needed, as you suggested
* On the other hand, it lives in the YARN module, which makes it inaccessible from core
* I don't think it's a good idea to move either that function or the token provider
I see mainly the following ways:
* The token provider asks the manager for the file systems: I would prefer this
* Use some sort of init function where parameters can be passed: not all providers are interested in this parameter (actually only the FS provider is)
* Split the token provider into YARN... Mesos... variants and implement the filesystem-providing functions there: I think that's overkill
Unless you have a better idea, I'll go with the first one.
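To make the preferred option concrete, here is a minimal Scala sketch of the "provider asks the manager" shape. The class and method names below (DelegationTokenManager, filesystemsToAccess, etc.) are illustrative stand-ins, not the actual Spark APIs; the point is only that obtainDelegationTokens loses its filesystem-list parameter and the one provider that cares pulls the list from the manager instead.

```scala
// Hedged sketch -- hypothetical names, not Spark's real classes.
// Stand-in for org.apache.hadoop.fs.FileSystem.
case class FileSystem(uri: String)

// The manager owns the logic for computing which file systems to access
// (in Spark this lives in YARN-specific code today).
class DelegationTokenManager(fsSource: () => Set[FileSystem]) {
  def filesystemsToAccess(): Set[FileSystem] = fsSource()
}

trait HadoopDelegationTokenProvider {
  def serviceName: String
  // No filesystem-list parameter: providers that need it ask the manager.
  def obtainDelegationTokens(manager: DelegationTokenManager): Unit
}

// Only the FS provider actually queries the manager; other providers
// (Hive, HBase, ...) would simply ignore it.
class HadoopFSDelegationTokenProvider extends HadoopDelegationTokenProvider {
  val serviceName = "hadoopfs"
  var obtained: Set[FileSystem] = Set.empty

  def obtainDelegationTokens(manager: DelegationTokenManager): Unit = {
    obtained = manager.filesystemsToAccess()
  }
}
```

This keeps the filesystem-resolution logic in one place (reachable from core via the manager) without forcing an init-style parameter onto providers that never use it.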
> Remove the list of filesystems from HadoopDelegationTokenProvider.obtainDelegationTokens
> ----------------------------------------------------------------------------------------
>
> Key: SPARK-26766
> URL: https://issues.apache.org/jira/browse/SPARK-26766
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 3.0.0
> Reporter: Gabor Somogyi
> Priority: Major
>
> This was discussed in previous PR [here|https://github.com/apache/spark/pull/23499/files#diff-406f99efa37915001b613de47815e25cR54].
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)