Posted to issues@flink.apache.org by "Chesnay Schepler (Jira)" <ji...@apache.org> on 2019/12/12 13:28:00 UTC

[jira] [Closed] (FLINK-15215) Not able to provide a custom AWS credentials provider with flink-s3-fs-hadoop

     [ https://issues.apache.org/jira/browse/FLINK-15215?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chesnay Schepler closed FLINK-15215.
------------------------------------
    Resolution: Duplicate

FLINK-11956 aims to remove shading from the S3 filesystems, which would allow you to easily write custom credential providers.

In the meantime, as an admittedly inconvenient workaround, you could implement your provider against the relocated classes within the corresponding S3 filesystem jar.
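
To illustrate the workaround: the provider class itself is a small amount of code; the difficulty is only that it must implement the *relocated* interface inside the flink-s3-fs-hadoop jar rather than the plain AWS SDK one. The sketch below stubs out the AWS SDK types so it stands alone; the stub names mirror the real SDK interface, but in an actual build you would compile against the relocated classes found by listing the jar's contents (e.g. {{jar tf flink-s3-fs-hadoop-1.9.0.jar | grep Credentials}}), whose exact package prefix is not shown here.

```java
// Stand-ins for the (relocated) AWS SDK types -- stubbed here so this
// sketch compiles on its own. In practice, implement the shaded interface
// that ships inside the flink-s3-fs-hadoop jar.
interface AWSCredentials {
    String getAWSAccessKeyId();
    String getAWSSecretKey();
}

interface AWSCredentialsProvider {
    AWSCredentials getCredentials();
    void refresh();
}

// Hypothetical custom provider; a real one would assume a role in the
// other AWS account and return the resulting temporary credentials.
class CrossAccountCredentialsProvider implements AWSCredentialsProvider {
    @Override
    public AWSCredentials getCredentials() {
        // Placeholder values keep the sketch self-contained and runnable.
        return new AWSCredentials() {
            @Override public String getAWSAccessKeyId() { return "ACCESS_KEY_PLACEHOLDER"; }
            @Override public String getAWSSecretKey()   { return "SECRET_KEY_PLACEHOLDER"; }
        };
    }

    @Override
    public void refresh() {
        // Re-assume the role / refresh the temporary credentials here.
    }
}

public class Main {
    public static void main(String[] args) {
        AWSCredentialsProvider provider = new CrossAccountCredentialsProvider();
        System.out.println(provider.getCredentials().getAWSAccessKeyId());
    }
}
```

The class name (here {{CrossAccountCredentialsProvider}}) is what would go into the {{fs.s3a.aws.credentials.provider}} setting, using the relocated interface's fully-qualified name on the classpath of the plugin.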

> Not able to provide a custom AWS credentials provider with flink-s3-fs-hadoop
> -----------------------------------------------------------------------------
>
>                 Key: FLINK-15215
>                 URL: https://issues.apache.org/jira/browse/FLINK-15215
>             Project: Flink
>          Issue Type: Bug
>          Components: FileSystems
>    Affects Versions: 1.9.0
>            Reporter: Tank
>            Priority: Major
>
> I am using Flink 1.9.0 on EMR (emr 5.28), with StreamingFileSink using an S3 filesystem.
> I want flink to write to an S3 bucket which is running in another AWS account, and I want to do that by assuming a role in the other account. This can be easily accomplished by providing a custom credential provider (similar to [https://aws.amazon.com/blogs/big-data/securely-analyze-data-from-another-aws-account-with-emrfs/])
>  
> As described in https://ci.apache.org/projects/flink/flink-docs-stable/ops/filesystems/#pluggable-file-systems, I copied {{flink-s3-fs-hadoop-1.9.0.jar}} to the plugins directory. But the class configured via {{fs.s3a.aws.credentials.provider}} is relocated (shaded) by the factory ([https://github.com/apache/flink/blob/master/flink-filesystems/flink-s3-fs-hadoop/src/main/java/org/apache/flink/fs/s3hadoop/S3FileSystemFactory.java#L47]), as are all the AWS SDK dependencies, so when I provided a custom credential provider, it complained that I was not implementing the correct interface ({{AWSCredentialsProvider}}).
> The fix made in https://issues.apache.org/jira/browse/FLINK-13044 allows users to use one of the built-in credential providers like {{InstanceProfileCredentialsProvider}}, but still does not help with providing custom credential providers.
>  
> Related: https://issues.apache.org/jira/browse/FLINK-13602
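
For completeness, the configuration-only path that the FLINK-13044 fix enables (built-in providers only) looks like the following sketch; the key is the standard Hadoop S3A setting already quoted in the report above, and the value is remapped to the relocated class name internally by the shaded filesystem:

```yaml
# flink-conf.yaml -- select a bundled credentials provider (no custom class).
# Works because Flink remaps this value to the shaded/relocated class name;
# a user-supplied class on the value here would NOT be remapped, which is
# exactly the limitation this issue describes.
fs.s3a.aws.credentials.provider: com.amazonaws.auth.InstanceProfileCredentialsProvider
```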



--
This message was sent by Atlassian Jira
(v8.3.4#803005)