Posted to issues@spark.apache.org by "Steve Loughran (Jira)" <ji...@apache.org> on 2020/02/21 14:04:01 UTC
[jira] [Resolved] (SPARK-24000) S3A: Create Table should fail on invalid AK/SK
[ https://issues.apache.org/jira/browse/SPARK-24000?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Steve Loughran resolved SPARK-24000.
------------------------------------
Resolution: Done
closing as done but not giving a spark version, as it just depends on which hadoop version you use, rather than a specific spark release
> S3A: Create Table should fail on invalid AK/SK
> ----------------------------------------------
>
> Key: SPARK-24000
> URL: https://issues.apache.org/jira/browse/SPARK-24000
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Affects Versions: 2.3.0
> Reporter: Brahma Reddy Battula
> Priority: Major
>
> Currently, when we pass an *invalid AK/SK*, *create table* *succeeds*.
> When the S3AFileSystem is initialized, *verifyBucketExists()* is called, which returns *true* because status code 403 (*BUCKET_ACCESS_FORBIDDEN_STATUS_CODE*) is treated as "bucket exists" in the following code:
> {code:java}
> public boolean doesBucketExist(String bucketName)
>     throws AmazonClientException, AmazonServiceException {
>   try {
>     headBucket(new HeadBucketRequest(bucketName));
>     return true;
>   } catch (AmazonServiceException ase) {
>     // A redirect error or a forbidden error means the bucket exists. So
>     // returning true.
>     if ((ase.getStatusCode() == Constants.BUCKET_REDIRECT_STATUS_CODE)
>         || (ase.getStatusCode() == Constants.BUCKET_ACCESS_FORBIDDEN_STATUS_CODE)) {
>       return true;
>     }
>     if (ase.getStatusCode() == Constants.NO_SUCH_BUCKET_STATUS_CODE) {
>       return false;
>     }
>     throw ase;
>   }
> }{code}
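To see why invalid credentials slip through, the status-code branching above can be sketched in isolation. This is a hedged illustration, not the real AWS SDK or S3A code: the class name and plain integer status codes (301, 403, 404) stand in for the SDK's HeadBucketRequest call and the Constants.* values quoted in the issue.

```java
// Sketch of the quoted doesBucketExist() decision logic, reduced to the
// status code of the HEAD-bucket probe. Hypothetical class; the real
// logic lives in the AWS SDK's AmazonS3Client.
public class BucketProbeSketch {
    static final int REDIRECT = 301;   // stand-in for BUCKET_REDIRECT_STATUS_CODE
    static final int FORBIDDEN = 403;  // stand-in for BUCKET_ACCESS_FORBIDDEN_STATUS_CODE
    static final int NOT_FOUND = 404;  // stand-in for NO_SUCH_BUCKET_STATUS_CODE

    // Mirrors the quoted branching: a 403 is mapped to "bucket exists",
    // so a probe made with an invalid AK/SK still reports success.
    static boolean doesBucketExist(int headBucketStatus) {
        if (headBucketStatus == REDIRECT || headBucketStatus == FORBIDDEN) {
            return true;  // invalid credentials land here and pass the check
        }
        if (headBucketStatus == NOT_FOUND) {
            return false;
        }
        throw new IllegalStateException("unexpected status " + headBucketStatus);
    }

    public static void main(String[] args) {
        System.out.println(doesBucketExist(FORBIDDEN)); // true even with bad keys
        System.out.println(doesBucketExist(NOT_FOUND)); // false
    }
}
```

The consequence is the behaviour reported above: because 403 and "exists" are collapsed into the same branch, the credential error only surfaces later, on the first real read or write, rather than at create-table time.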
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org