Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/03/30 09:25:41 UTC
[jira] [Updated] (SPARK-20153) Support Multiple aws credentials in order to access multiple Hive on S3 table in spark application
[ https://issues.apache.org/jira/browse/SPARK-20153?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-20153:
------------------------------
Labels: (was: enhancement)
Priority: Minor (was: Major)
Issue Type: Improvement (was: Bug)
> Support Multiple aws credentials in order to access multiple Hive on S3 table in spark application
> ---------------------------------------------------------------------------------------------------
>
> Key: SPARK-20153
> URL: https://issues.apache.org/jira/browse/SPARK-20153
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 2.0.1, 2.1.0
> Reporter: Franck Tago
> Priority: Minor
>
> I need to access multiple Hive tables in my Spark application, where each Hive table is
> 1- an external table whose data sits on S3, and
> 2- owned by a different AWS user, so I need to provide different AWS credentials for each table.
> I am familiar with setting the AWS credentials in the Hadoop configuration object, but that does not really help here because I can only set a single global pair of credentials (fs.s3a.access.key, fs.s3a.secret.key).
> From my research, there is no easy or elegant way to do this in Spark.
> Why is that?
> How do I address this use case?
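> For reference, a minimal sketch of the kind of per-bucket configuration I am looking for, assuming an S3A connector recent enough to honour fs.s3a.bucket.<bucket>.access.key (Hadoop 2.8.0+), an existing SparkSession named spark, and two hypothetical buckets, bucket-a and bucket-b, owned by different accounts:
>
>   // Sketch only: assumes spark-shell / an existing SparkSession named `spark`
>   // and an S3A connector that supports per-bucket settings (Hadoop 2.8.0+).
>   val hadoopConf = spark.sparkContext.hadoopConfiguration
>
>   // Credentials used only for paths under s3a://bucket-a/
>   hadoopConf.set("fs.s3a.bucket.bucket-a.access.key", "ACCESS_KEY_FOR_A")
>   hadoopConf.set("fs.s3a.bucket.bucket-a.secret.key", "SECRET_KEY_FOR_A")
>
>   // Credentials used only for paths under s3a://bucket-b/
>   hadoopConf.set("fs.s3a.bucket.bucket-b.access.key", "ACCESS_KEY_FOR_B")
>   hadoopConf.set("fs.s3a.bucket.bucket-b.secret.key", "SECRET_KEY_FOR_B")
>
>   // Each external Hive table then resolves the credentials scoped to its own bucket.
>   spark.table("hive_table_on_bucket_a").show()
>   spark.table("hive_table_on_bucket_b").show()
>
> With per-bucket keys like these, each s3a:// path would pick up the credentials for its own bucket instead of a single global pair.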
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org