Posted to common-issues@hadoop.apache.org by "Steve Loughran (Jira)" <ji...@apache.org> on 2020/02/04 13:29:00 UTC

[jira] [Reopened] (HADOOP-16838) Support for `fs.s3a.endpoint.region`

     [ https://issues.apache.org/jira/browse/HADOOP-16838?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Steve Loughran reopened HADOOP-16838:
-------------------------------------

> Support for `fs.s3a.endpoint.region`
> ------------------------------------
>
>                 Key: HADOOP-16838
>                 URL: https://issues.apache.org/jira/browse/HADOOP-16838
>             Project: Hadoop Common
>          Issue Type: New Feature
>            Reporter: Nitish
>            Priority: Major
>
> Currently it is not possible to connect S3-compatible services such as MinIO or Ceph (running with a custom region) to Spark through the s3a connector. For example, suppose MinIO is running on a server with
>  * IP Address: 192.168.0.100
>  * Region: ap-southeast-1
> The s3a connector cannot be configured to use the region `ap-southeast-1`.
> It would be great to have a configuration property like `fs.s3a.endpoint.region`. This would be very helpful for users deploying a private cloud who intend to use S3-like services on premises.
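A configuration of the requested form might look like the following core-site.xml sketch. This is only an illustration of the proposal: `fs.s3a.endpoint.region` is the property name suggested in this issue (not yet an existing setting at the time of this report), while `fs.s3a.endpoint` and `fs.s3a.path.style.access` are existing S3A properties; the endpoint address and region values are the example values from the description.

```xml
<!-- Sketch of the proposed configuration; fs.s3a.endpoint.region is the
     property requested by this issue, values are illustrative. -->
<configuration>
  <!-- Point the S3A connector at the on-premises MinIO server. -->
  <property>
    <name>fs.s3a.endpoint</name>
    <value>http://192.168.0.100:9000</value>
  </property>
  <!-- Proposed: override the signing region for the custom endpoint. -->
  <property>
    <name>fs.s3a.endpoint.region</name>
    <value>ap-southeast-1</value>
  </property>
  <!-- Path-style access is typically needed for S3-compatible stores. -->
  <property>
    <name>fs.s3a.path.style.access</name>
    <value>true</value>
  </property>
</configuration>
```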



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org