Posted to issues@spark.apache.org by "Daniel Carl Jones (Jira)" <ji...@apache.org> on 2022/07/04 09:59:00 UTC
[jira] [Commented] (SPARK-38958) Override S3 Client in Spark Write/Read calls
[ https://issues.apache.org/jira/browse/SPARK-38958?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17562115#comment-17562115 ]
Daniel Carl Jones commented on SPARK-38958:
-------------------------------------------
I'm not aware of documented support for this right now.
Is your use case limited to setting static headers on all S3 requests made by your Spark application - i.e., would the same header keys and values apply to every request, or would they need to vary per file/request?
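In case a concrete sketch helps frame the question: the header-adding part itself is straightforward with the AWS SDK for Java v1 (the SDK bundled with hadoop-aws in the 3.2.x line) through a RequestHandler2. The sketch below only illustrates that mechanism; the class and header names are invented for the example, and it builds its own client rather than the one Spark's S3A connector creates.

import com.amazonaws.Request;
import com.amazonaws.handlers.RequestHandler2;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class StaticHeaderExample {

  // Adds the same header to every outgoing request made by the client it is
  // registered on. RequestHandler2 and Request.addHeader are standard SDK v1 API.
  static class StaticHeaderHandler extends RequestHandler2 {
    private final String name;
    private final String value;

    StaticHeaderHandler(String name, String value) {
      this.name = name;
      this.value = value;
    }

    @Override
    public void beforeRequest(Request<?> request) {
      request.addHeader(name, value);
    }
  }

  public static void main(String[] args) {
    // Any call made through this client (putObject, getObject, ...) carries the
    // extra header. Credentials are resolved lazily, so construction alone works.
    AmazonS3 s3 = AmazonS3ClientBuilder.standard()
        .withRegion("us-east-1")
        .withRequestHandlers(new StaticHeaderHandler("x-custom-header", "demo-value"))
        .build();
    System.out.println("S3 client with static header handler created: " + s3);
  }
}

Getting such a handler onto the client that Spark/S3A builds internally is the unsupported part: it would mean a custom org.apache.hadoop.fs.s3a.S3ClientFactory wired in through the internal fs.s3a.s3.client.factory.impl setting, and that interface is private and has changed between Hadoop releases, so I would not treat it as documented support.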
> Override S3 Client in Spark Write/Read calls
> --------------------------------------------
>
> Key: SPARK-38958
> URL: https://issues.apache.org/jira/browse/SPARK-38958
> Project: Spark
> Issue Type: New Feature
> Components: Spark Core
> Affects Versions: 3.2.1
> Reporter: Hershal
> Priority: Major
>
> Hello,
> I have been working on using Spark to read and write data to S3. Unfortunately, there are a few S3 headers that I need to add to my Spark read/write calls. After much looking, I have not found a way to replace the S3 client that Spark uses to make the read/write calls, nor a configuration that allows me to pass in S3 headers. Here is an example of some common S3 request headers: https://docs.aws.amazon.com/AmazonS3/latest/API/RESTCommonRequestHeaders.html. Does functionality already exist to add S3 headers to Spark read/write calls, or to pass in a custom client that would send these headers on every read/write request? I appreciate the help and feedback.
>
> Thanks,