Posted to issues@iceberg.apache.org by GitBox <gi...@apache.org> on 2021/07/09 19:23:42 UTC

[GitHub] [iceberg] kbendick edited a comment on pull request #2792: [SPARK] Allow spark catalogs to have hadoop configuration overrides p…

kbendick edited a comment on pull request #2792:
URL: https://github.com/apache/iceberg/pull/2792#issuecomment-877396902


   > I believe this is per catalog, not per table, right? Does it also support table level setting?
   > 
   > > E.g. for a table foo, to override a hadoop config property fs.s3a.max.connections, a config would be added to the spark session config via --conf spark.sql.catalog.foo.hadoop.fs.s3a.max.connections=4.
   
   That's correct, the overrides are per catalog, not per table. I have updated the description to say catalog instead (there's no way to set the hadoop config separately for individual Iceberg tables in the same catalog; all tables in a catalog share the same overrides).
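
For illustration, a minimal sketch of the same catalog-scoped override applied programmatically rather than via --conf on the command line. This assumes the `.hadoop.` prefix behavior described in this PR; the catalog name foo and the fs.s3a.max.connections property are carried over from the example above, while the SparkCatalog/Hive settings are assumed here for completeness and are not part of this thread.

```scala
// Sketch: any property set under spark.sql.catalog.<catalog_name>.hadoop.*
// would be applied as a Hadoop configuration override for that Iceberg
// catalog, affecting every table resolved through it (not a single table).
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("iceberg-catalog-hadoop-overrides")
  // Register an Iceberg catalog named `foo` (assumed Hive-backed for this sketch)
  .config("spark.sql.catalog.foo", "org.apache.iceberg.spark.SparkCatalog")
  .config("spark.sql.catalog.foo.type", "hive")
  // Catalog-scoped Hadoop override; property name mirrors the PR description's example
  .config("spark.sql.catalog.foo.hadoop.fs.s3a.max.connections", "4")
  .getOrCreate()
```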


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org
For additional commands, e-mail: issues-help@iceberg.apache.org