Posted to issues@iceberg.apache.org by GitBox <gi...@apache.org> on 2023/01/20 02:21:29 UTC

[GitHub] [iceberg] cgpoh commented on issue #6606: MinIO com.amazonaws.SdkClientException: Unable to execute HTTP request: Timeout waiting for connection from pool

cgpoh commented on issue #6606:
URL: https://github.com/apache/iceberg/issues/6606#issuecomment-1397847153

   After looking into the code, I realised that instead of setting `s3.connection.maximum` in the Flink configuration, I should set the value in the Hadoop configuration and pass that configuration to the HiveCatalog instead.
   
   ```kotlin
   import org.apache.hadoop.conf.Configuration
   import org.apache.iceberg.flink.CatalogLoader

   // Set the S3A connection pool size on the Hadoop configuration and pass it to the Hive catalog loader.
   val conf = Configuration()
   conf.setInt("fs.s3a.connection.maximum", 100)
   val catalogLoader = CatalogLoader.hive(hiveCatalogName, conf, properties)
   ```
   
   With this, I can see in the logs that the maximum connection count is set to 100. I will close this issue for now, as my job has been running for 24 hrs.
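
   For anyone hitting the same problem, a minimal sketch (reusing the `conf` from the snippet above; the printed line is just an illustration, not Iceberg's own log output) to confirm the value is picked up before building the catalog loader:

   ```kotlin
   // Read back the pool size that was just set so it shows up in the job logs.
   val poolSize = conf.get("fs.s3a.connection.maximum")
   println("fs.s3a.connection.maximum = $poolSize") // expect "100"
   ```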


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

