Posted to issues@iceberg.apache.org by GitBox <gi...@apache.org> on 2020/12/01 03:24:00 UTC

[GitHub] [iceberg] johnclara commented on a change in pull request #1844: AWS: support custom credentials, region and assume role functionality

johnclara commented on a change in pull request #1844:
URL: https://github.com/apache/iceberg/pull/1844#discussion_r533048395



##########
File path: core/src/main/java/org/apache/iceberg/CatalogUtil.java
##########
@@ -171,45 +172,57 @@ public static Catalog loadCatalog(
 
   /**
    * Load a custom {@link FileIO} implementation.
+   */
+  public static FileIO loadFileIO(
+      String impl,
+      Map<String, String> properties,
+      Configuration hadoopConf) {
+    return loadCatalogConfigurable(impl, properties, hadoopConf, FileIO.class);
+  }
+
+  /**
+   * Load a custom implementation of a {@link CatalogConfigurable}.
    * <p>
    * The implementation must have a no-arg constructor.
-   * If the class implements {@link Configurable},
+   * If the class implements Hadoop's {@link Configurable},
    * a Hadoop config will be passed using {@link Configurable#setConf(Configuration)}.
-   * {@link FileIO#initialize(Map properties)} is called to complete the initialization.
+   * {@link CatalogConfigurable#initialize(Map properties)} is called to complete the initialization.
    *
    * @param impl full class name of a custom FileIO implementation
    * @param hadoopConf hadoop configuration
+   * @param resultClass the final return class type
    * @return FileIO class
    * @throws IllegalArgumentException if class path not found or
    *  right constructor not found or
    *  the loaded class cannot be casted to the given interface type
    */
-  public static FileIO loadFileIO(
+  public static <T extends CatalogConfigurable> T loadCatalogConfigurable(
       String impl,
       Map<String, String> properties,
-      Configuration hadoopConf) {
-    LOG.info("Loading custom FileIO implementation: {}", impl);
-    DynConstructors.Ctor<FileIO> ctor;
+      Configuration hadoopConf,
+      Class<T> resultClass) {
+    LOG.info("Loading custom {} implementation: {}", resultClass.getName(), impl);
+    DynConstructors.Ctor<T> ctor;
     try {
-      ctor = DynConstructors.builder(FileIO.class).impl(impl).buildChecked();
+      ctor = DynConstructors.builder(resultClass).impl(impl).buildChecked();

Review comment:
       You're right, serialization is unrelated. I'm probably missing something, but my understanding of this PR is that it's trying to mimic being able to pass an AWS service client builder through to the catalog?
   
   But since the catalog can only be passed a Map<String, String>, what you do is pass a custom builder class name plus a flat set of args. Then, when the catalog wants the builder, it instantiates your builder class and hands the set of args back to you, and you're expected to put them back into the right places.
   
   I think an example is: if I want the catalog to use this config for its Dynamo client:
   ```
   Dynamo.builder()
       .withRegion("myRegion")
       .withProxyConfig(ProxyConfig.builder()
           .withPort(1000)
           .withHost("localhost")
           .build())
       .build()
   ```
   This PR lets a user create something like this:
   ```
   class CustomDynamoBuilder {
     public Dynamo dynamo;
   
     public void initialize(Map<String, String> props) {
       dynamo = Dynamo.builder()
           .withRegion(props.get("region"))
           .withProxyConfig(ProxyConfig.builder()
               .withPort(Integer.parseInt(props.get("port")))
               .withHost(props.get("host"))
               .build())
           .build();
     }
   }
   ```
   Then the user is expected to push that flat map through from the source: Map(dynamoClass -> CustomDynamoBuilder, region -> myRegion, port -> 1000, host -> localhost).
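   
   Roughly, the handoff would look something like this (just a sketch; I'm assuming CustomDynamoBuilder implements the new CatalogConfigurable interface so it satisfies loadCatalogConfigurable's bound, and the property keys are made up):
   ```
   Map<String, String> props = new HashMap<>();
   props.put("dynamoClass", CustomDynamoBuilder.class.getName());
   props.put("region", "myRegion");
   props.put("port", "1000");
   props.put("host", "localhost");
   
   // the catalog reflectively instantiates the builder (no-arg constructor) and then
   // hands the flat props back to it through initialize(Map)
   CustomDynamoBuilder builder = CatalogUtil.loadCatalogConfigurable(
       props.get("dynamoClass"), props, new Configuration() /* hadoop conf */, CustomDynamoBuilder.class);
   Dynamo dynamo = builder.dynamo;  // fully configured client, rebuilt from the flat strings
   ```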
   
   So the user has to reimplement the AWS builders for every service they want to customize, by writing a mapping from a flat list of strings to every property in the config.
   
   Is this basically that, except more modularized: you can define something like a ProxyConfigBuilder, which will then try to apply that proxy config to each custom builder you make, where the builder allows it (and likewise for things other than proxies, e.g. credentials)?
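   
   Something like this is what I mean by "more modularized" (completely hypothetical, names made up):
   ```
   // one reusable piece that owns the proxy-related properties and can be applied to
   // any service client builder that supports it, instead of every custom builder
   // re-parsing the same flat strings
   interface ProxyConfigBuilder {
     void initialize(Map<String, String> props);
   
     <B> B applyTo(B clientBuilder);  // no-op when the builder doesn't support a proxy
   }
   ```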




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org
For additional commands, e-mail: issues-help@iceberg.apache.org