Posted to user@spark.apache.org by rahulkumar-aws <ra...@gmail.com> on 2014/12/12 09:16:41 UTC

Re: Access to s3 from spark

Try any one of the following:
*1. Set the access key and secret key in the SparkConf used to create the
SparkContext* (SparkContext itself has no set method; Spark copies any
spark.hadoop.* property into the Hadoop configuration):
val conf = new SparkConf()
  .set("spark.hadoop.fs.s3n.awsAccessKeyId", yourAccessKey)
  .set("spark.hadoop.fs.s3n.awsSecretAccessKey", yourSecretKey)
val sparkContext = new SparkContext(conf)

*2. Set the access key and secret key in the environment before starting
your application:*

export AWS_ACCESS_KEY_ID=<your access>
export AWS_SECRET_ACCESS_KEY=<your secret>
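
(Spark copies AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the environment
into its Hadoop configuration when the SparkContext is created, so they have to
be exported in the shell that launches the driver, e.g. before running
spark-shell or spark-submit.)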

*3. Set the access key and secret key in the Hadoop configuration:*
val hadoopConf = sparkContext.hadoopConfiguration
// Route s3:// URIs to the native S3 filesystem and attach the credentials.
hadoopConf.set("fs.s3.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
hadoopConf.set("fs.s3.awsAccessKeyId", yourAccessKey)
hadoopConf.set("fs.s3.awsSecretAccessKey", yourSecretKey)



-----
Software Developer
SigmoidAnalytics, Bangalore

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Access-to-s3-from-spark-tp20631p20654.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
