Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/07/09 17:38:00 UTC
[jira] [Resolved] (SPARK-32035) Inconsistent AWS environment variables in documentation
[ https://issues.apache.org/jira/browse/SPARK-32035?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-32035.
-----------------------------------
Fix Version/s: 3.1.0
               2.4.7
               3.0.1
   Resolution: Fixed
Issue resolved by pull request 29058
[https://github.com/apache/spark/pull/29058]
> Inconsistent AWS environment variables in documentation
> -------------------------------------------------------
>
> Key: SPARK-32035
> URL: https://issues.apache.org/jira/browse/SPARK-32035
> Project: Spark
> Issue Type: Bug
> Components: Documentation
> Affects Versions: 2.4.6, 3.0.0
> Reporter: Ondrej Kokes
> Priority: Minor
> Fix For: 3.0.1, 2.4.7, 3.1.0
>
>
> Looking at the actual Scala code, the environment variables used to authenticate with AWS are:
> - AWS_ACCESS_KEY_ID
> - AWS_SECRET_ACCESS_KEY
> - AWS_SESSION_TOKEN
> These are the same names AWS uses in its own libraries.
> However, looking through the Spark documentation and comments, I see that these variables are not named consistently across the board:
> docs/cloud-integration.md
> 106:1. `spark-submit` reads the `AWS_ACCESS_KEY`, `AWS_SECRET_KEY` *<-- both different*
> 107:and `AWS_SESSION_TOKEN` environment variables and sets the associated authentication options
> docs/streaming-kinesis-integration.md
> 232:- Set up the environment variables `AWS_ACCESS_KEY_ID` and `AWS_SECRET_KEY` with your AWS credentials. *<-- secret key different*
> external/kinesis-asl/src/main/python/examples/streaming/kinesis_wordcount_asl.py
> 34: $ export AWS_ACCESS_KEY_ID=<your-access-key>
> 35: $ export AWS_SECRET_KEY=<your-secret-key> *<-- different*
> 48: Environment Variables - AWS_ACCESS_KEY_ID and AWS_SECRET_KEY *<-- secret key different*
> core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala
> 438: val keyId = System.getenv("AWS_ACCESS_KEY_ID")
> 439: val accessKey = System.getenv("AWS_SECRET_ACCESS_KEY")
> 448: val sessionToken = System.getenv("AWS_SESSION_TOKEN")
> external/kinesis-asl/src/main/scala/org/apache/spark/examples/streaming/KinesisWordCountASL.scala
> 53: * $ export AWS_ACCESS_KEY_ID=<your-access-key>
> 54: * $ export AWS_SECRET_KEY=<your-secret-key> *<-- different*
> 65: * Environment Variables - AWS_ACCESS_KEY_ID and AWS_SECRET_KEY *<-- secret key different*
> external/kinesis-asl/src/main/java/org/apache/spark/examples/streaming/JavaKinesisWordCountASL.java
> 59: * $ export AWS_ACCESS_KEY_ID=[your-access-key]
> 60: * $ export AWS_SECRET_KEY=<your-secret-key> *<-- different*
> 71: * Environment Variables - AWS_ACCESS_KEY_ID and AWS_SECRET_KEY *<-- secret key different*
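For reference, the exports below sketch the canonical variable names as the resolved Scala code (SparkHadoopUtil.scala, quoted above) reads them; the values are placeholders, not real credentials:

```shell
# Canonical AWS credential variables, matching SparkHadoopUtil.scala
export AWS_ACCESS_KEY_ID="<your-access-key>"
# AWS_SECRET_ACCESS_KEY -- not AWS_SECRET_KEY, as some docs and examples showed
export AWS_SECRET_ACCESS_KEY="<your-secret-key>"
# Only needed when using temporary (session) credentials
export AWS_SESSION_TOKEN="<your-session-token>"
```

Pull request 29058 aligned the documentation and examples on these names.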
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org