Posted to user@hive.apache.org by Terry Siu <te...@dev9.com> on 2016/02/02 16:52:27 UTC

Hive table over S3 bucket with s3a

Hi,

I’m wondering if anyone has found a workaround for defining a Hive table over an S3 bucket when the secret access key contains ‘/’ characters. I’m using Hive 0.14 on HDP 2.2.4, and the statement I used is:


CREATE EXTERNAL TABLE IF NOT EXISTS s3_foo (
  key INT, value STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION 's3a://<access key>:<secret key>@<bucket>/<folder>';


The following error is returned:


FAILED: IllegalArgumentException The bucketName parameter must be specified.


A workaround was to set the fs.s3a.access.key and fs.s3a.secret.key configuration properties and then change the location URL to s3a://<bucket>/<folder>. However, this produces the following error:


FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:com.amazonaws.AmazonClientException: Unable to load AWS credentials from any provider in the chain)
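
For reference, the per-session form of that workaround looks like the sketch below (placeholders, not real keys). The credentials-chain error suggests that the component validating the table location never sees session-level settings, so one plausible fix, assuming the metastore reads the standard Hadoop configuration, would be to define the same two properties in core-site.xml on the cluster rather than per session.

-- Per-session form of the workaround; <access key> and <secret key> are placeholders.
SET fs.s3a.access.key=<access key>;
SET fs.s3a.secret.key=<secret key>;
-- ...followed by the same CREATE EXTERNAL TABLE, now with LOCATION 's3a://<bucket>/<folder>';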


Has anyone found a way to create a Hive table over S3 when the key contains ‘/’ characters, or is it just standard practice to regenerate the keys until IAM returns one without the offending characters?


Thanks,

-Terry

Re: Hive table over S3 bucket with s3a

Posted by Terry Siu <te...@dev9.com>.
Yeah, that’s what I thought. I found this: https://issues.apache.org/jira/browse/HADOOP-3733. I posted a couple of questions there, but before that the last comment was over a year ago. Thanks for the response!
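
One workaround that has been suggested for this class of problem with the older s3/s3n connectors is to percent-encode the slash inside the URI; whether s3a in Hive 0.14 decodes it the same way is unverified, so treat the following purely as a sketch with a made-up key:

-- Hypothetical: a secret key such as abc/def percent-encoded as abc%2Fdef in the URI.
LOCATION 's3a://<access key>:abc%2Fdef@<bucket>/<folder>';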

-Terry


Re: Hive table over S3 bucket with s3a

Posted by Elliot West <te...@gmail.com>.
When I last looked at this it was recommended to simply regenerate the key as you suggest.
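
As a later alternative, the Hadoop credential provider framework (added around Hadoop 2.6, with s3a support arriving in 2.8, so not available on Hive 0.14/HDP 2.2.4) lets the keys live in a JCEKS store instead of a URI or XML file, sidestepping the ‘/’ problem entirely. A minimal sketch, assuming such a store was created beforehand with the hadoop credential CLI (e.g. hadoop credential create fs.s3a.secret.key -provider jceks://hdfs@<namenode>/<path>/s3.jceks):

-- Assumption: the JCEKS store below already holds fs.s3a.access.key and fs.s3a.secret.key.
SET hadoop.security.credential.provider.path=jceks://hdfs@<namenode>/<path>/s3.jceks;
-- The table location then needs no inline credentials:
-- LOCATION 's3a://<bucket>/<folder>'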
