Posted to issues@spark.apache.org by "Aniket Bhatnagar (JIRA)" <ji...@apache.org> on 2014/11/04 19:30:34 UTC

[jira] [Commented] (SPARK-3640) KinesisUtils should accept a credentials object instead of forcing DefaultCredentialsProvider

    [ https://issues.apache.org/jira/browse/SPARK-3640?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14196534#comment-14196534 ] 

Aniket Bhatnagar commented on SPARK-3640:
-----------------------------------------

Thanks Chris for looking into this. That documentation would certainly be useful. However, the problem I am facing with DefaultCredentialsProvider is that every node in the cluster needs to be set up with those credentials in the user's home directory, which is a bit tedious. I would like the driver to pass credentials to all nodes in the cluster to avoid that operational overhead. I have submitted a pull request with the changes I had to make to let the driver pass user-defined credentials. Do have a look and let me know if there is a better way.

https://github.com/apache/spark/pull/3092
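
To illustrate, here is a sketch of the kind of call site this would enable. The trailing credentials parameter is illustrative only (see the pull request for the actual change); the other parameters follow the existing KinesisUtils.createStream signature, and ssc is assumed to be an existing StreamingContext.

    import com.amazonaws.auth.BasicAWSCredentials
    import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream
    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.Seconds
    import org.apache.spark.streaming.kinesis.KinesisUtils

    // Credentials are created once on the driver (here from environment
    // variables) instead of every worker node needing ~/.aws/credentials.
    val credentials = new BasicAWSCredentials(
      sys.env("AWS_ACCESS_KEY_ID"),
      sys.env("AWS_SECRET_KEY"))

    val stream = KinesisUtils.createStream(
      ssc,                                        // existing StreamingContext
      "myKinesisStream",                          // Kinesis stream name
      "https://kinesis.us-east-1.amazonaws.com",  // endpoint URL
      Seconds(10),                                // checkpoint interval
      InitialPositionInStream.LATEST,             // where to start reading
      StorageLevel.MEMORY_AND_DISK_2,             // receiver storage level
      credentials)                                // user-supplied credentials (hypothetical parameter)

One wrinkle worth noting: the AWS SDK credentials classes are not Serializable, so whatever the receiver holds on to has to be wrapped in something serializable before it ships from the driver to the workers.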

> KinesisUtils should accept a credentials object instead of forcing DefaultCredentialsProvider
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-3640
>                 URL: https://issues.apache.org/jira/browse/SPARK-3640
>             Project: Spark
>          Issue Type: Improvement
>          Components: Streaming
>    Affects Versions: 1.1.0
>            Reporter: Aniket Bhatnagar
>              Labels: kinesis
>
> KinesisUtils should accept AWS credentials as a parameter and should default to DefaultCredentialsProvider if no credentials are provided. Currently, the implementation forces the use of DefaultCredentialsProvider, which can be a pain, especially when jobs are run by multiple Unix users.
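
The fallback described above amounts to roughly the following. This is a sketch, not the actual implementation: the Option-based helper is illustrative, while DefaultAWSCredentialsProviderChain is the concrete AWS SDK class behind the default lookup (environment variables, system properties, ~/.aws/credentials, instance profile).

    import com.amazonaws.auth.{AWSCredentials, AWSCredentialsProvider, DefaultAWSCredentialsProviderChain}

    // Use the caller's credentials when given; otherwise fall back to the
    // default provider chain, which is the current, forced behaviour.
    def resolveProvider(userCredentials: Option[AWSCredentials]): AWSCredentialsProvider =
      userCredentials match {
        case Some(c) =>
          new AWSCredentialsProvider {
            override def getCredentials: AWSCredentials = c
            override def refresh(): Unit = ()
          }
        case None =>
          new DefaultAWSCredentialsProviderChain()
      }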



