Posted to user@spark.apache.org by KhajaAsmath Mohammed <md...@gmail.com> on 2021/05/18 23:11:31 UTC

S3 Access Issues - Spark

Hi,

I have written a sample Spark job that reads data residing in HBase. I
keep getting the below error; any suggestions to resolve this, please?


Caused by: java.lang.IllegalArgumentException: AWS Access Key ID and Secret
Access Key must be specified by setting the fs.s3.awsAccessKeyId and
fs.s3.awsSecretAccessKey properties (respectively).
at org.apache.hadoop.fs.s3.S3Credentials.initialize(S3Credentials.java:74)


      conf.set("fs.s3.impl", "org.apache.hadoop.fs.s3.S3FileSystem")
      conf.set("fs.s3.awsAccessKeyId", "ddd")
      conf.set("fs.s3.awsSecretAccessKey", "dddddd")

      conf.set("fs.s3n.impl",
"org.apache.hadoop.fs.s3native.NativeS3FileSystem")
      conf.set("fs.s3n.awsAccessKeyId", "xxxxxxx")
      conf.set("fs.s3n.awsSecretAccessKey", "xxxx")

I tried these settings in both the Spark config and the HBase config, but
none of them resolved my issue.

Thanks,
Asmath

Re: S3 Access Issues - Spark

Posted by Badrinath Patchikolla <ba...@modak.com>.
Hi @KhajaAsmath Mohammed <ma...@gmail.com>,

The stack trace shows the job is going through the legacy s3:// connector, which reads the fs.s3.* properties. Try the s3a connector instead: set the fs.s3a.* properties below and reference the bucket with an s3a:// path, then see if you can access the S3 bucket.

spark.sparkContext.hadoopConfiguration.set("fs.s3a.access.key", aws.access_id)
spark.sparkContext.hadoopConfiguration.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
spark.sparkContext.hadoopConfiguration.set("fs.s3a.secret.key", aws.secret_access_key)


Thanks,

Badri
