Posted to issues@spark.apache.org by "Franck Tago (JIRA)" <ji...@apache.org> on 2017/03/30 05:54:41 UTC

[jira] [Created] (SPARK-20153) Support multiple AWS credentials in order to access multiple Hive on S3 tables in a Spark application

Franck Tago created SPARK-20153:
-----------------------------------

             Summary: Support multiple AWS credentials in order to access multiple Hive on S3 tables in a Spark application
                 Key: SPARK-20153
                 URL: https://issues.apache.org/jira/browse/SPARK-20153
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.1.0, 2.0.1
            Reporter: Franck Tago


I need to access multiple Hive tables in my Spark application, where each Hive table
1- is an external table whose data sits on S3, and
2- is owned by a different AWS user, so I need to provide a different pair of AWS credentials for each.

I am familiar with setting AWS credentials on the Hadoop configuration object, but that does not really help me here, because I can only set one pair of credentials (fs.s3a.access.key, fs.s3a.secret.key) for the whole application.
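
For concreteness, here is a minimal sketch of that single-credential approach, assuming the s3a connector; the table name and key values are placeholders:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("HiveOnS3Example")
  .enableHiveSupport()
  .getOrCreate()

// Global S3A credentials -- these apply to every s3a:// URI the
// application touches, so only ONE credential pair can be active.
val hadoopConf = spark.sparkContext.hadoopConfiguration
hadoopConf.set("fs.s3a.access.key", "<ACCESS_KEY_FOR_USER_A>")
hadoopConf.set("fs.s3a.secret.key", "<SECRET_KEY_FOR_USER_A>")

// Reading a second table owned by a different AWS user would mean
// overwriting the same two properties, which affects the whole application.
val df = spark.sql("SELECT * FROM external_hive_table_on_s3")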

From my research, there is no easy or elegant way to do this in Spark.

Why is that?

How do I address this use case?
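
The closest thing I have found is S3A's per-bucket configuration (fs.s3a.bucket.<bucket>.access.key / fs.s3a.bucket.<bucket>.secret.key), which I believe was added around Hadoop 2.8, but it only helps when each owner's table lives in its own bucket and the cluster ships a new enough Hadoop. A rough sketch, continuing from the snippet above, with hypothetical bucket and table names:

// Per-bucket credentials: S3A resolves fs.s3a.bucket.<bucket>.* settings
// for each bucket before falling back to the global fs.s3a.* values.
hadoopConf.set("fs.s3a.bucket.user-a-warehouse.access.key", "<ACCESS_KEY_A>")
hadoopConf.set("fs.s3a.bucket.user-a-warehouse.secret.key", "<SECRET_KEY_A>")
hadoopConf.set("fs.s3a.bucket.user-b-warehouse.access.key", "<ACCESS_KEY_B>")
hadoopConf.set("fs.s3a.bucket.user-b-warehouse.secret.key", "<SECRET_KEY_B>")

// Each external table now resolves against its own bucket's credentials
// within the same application.
val dfA = spark.sql("SELECT * FROM hive_table_owned_by_user_a")
val dfB = spark.sql("SELECT * FROM hive_table_owned_by_user_b")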



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org