Posted to issues@spark.apache.org by "Ian O Connell (JIRA)" <ji...@apache.org> on 2018/11/30 15:03:00 UTC

[jira] [Comment Edited] (SPARK-26043) Make SparkHadoopUtil private to Spark

    [ https://issues.apache.org/jira/browse/SPARK-26043?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16704829#comment-16704829 ] 

Ian O Connell edited comment on SPARK-26043 at 11/30/18 3:02 PM:
-----------------------------------------------------------------

This change makes it difficult to get a fully populated Hadoop configuration on executor hosts. If Spark properties were applied to the Hadoop conf in the driver, those don't show up in a `new Configuration`. Previously one could go SparkEnv -> SparkHadoopUtil -> Configuration and get one that's fully populated. Is there a nicer way to achieve this?

 

(Context: via the command line / in Spark I had been setting Hadoop configuration options, but I need to pick those up in some libraries on the executors to see what was set (in my case, whether s3guard is enabled). I need some means to hook into what the submitter thought the Hadoop conf should be, to turn reporting to DynamoDB for s3guard info on or off.)
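For context, the convention involved here is that any SparkConf property prefixed with "spark.hadoop." is copied, prefix stripped, into the Hadoop Configuration that Spark builds (this is what SparkHadoopUtil.newConfiguration does internally). A minimal sketch of just that prefix-stripping step, using a plain map to stand in for the SparkConf an executor-side library could still reach via SparkEnv (the class and method names here are illustrative, not Spark API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hedged sketch: reproduce Spark's "spark.hadoop.*" convention on a plain
// property map, so an executor-side library could re-derive the Hadoop
// entries (e.g. the s3guard metadata store setting) without SparkHadoopUtil.
public class HadoopConfFromSparkProps {
    static final String PREFIX = "spark.hadoop.";

    // Return only the spark.hadoop.*-prefixed properties, with the prefix
    // stripped, i.e. the entries that would land in the Hadoop Configuration.
    public static Map<String, String> hadoopEntries(Map<String, String> sparkProps) {
        Map<String, String> out = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : sparkProps.entrySet()) {
            if (e.getKey().startsWith(PREFIX)) {
                out.put(e.getKey().substring(PREFIX.length()), e.getValue());
            }
        }
        return out;
    }
}
```

In real code the input map would come from SparkEnv.get().conf().getAll(), and the resulting entries would be set onto a new Configuration; this sketch only shows the filtering convention itself.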


> Make SparkHadoopUtil private to Spark
> -------------------------------------
>
>                 Key: SPARK-26043
>                 URL: https://issues.apache.org/jira/browse/SPARK-26043
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Marcelo Vanzin
>            Assignee: Sean Owen
>            Priority: Minor
>              Labels: release-notes
>             Fix For: 3.0.0
>
>
> This API contains a few small helper methods used internally by Spark, mostly related to Hadoop configs and kerberos.
> It's been historically marked as "DeveloperApi". But in reality it's not very useful for others, and it changes too often to be considered a stable API. Better to just make it private to Spark.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org