Posted to commits@druid.apache.org by "a2l007 (via GitHub)" <gi...@apache.org> on 2023/03/13 23:30:44 UTC

[GitHub] [druid] a2l007 commented on issue #13923: Proposal: Druid extension to read and ingest Iceberg data files

a2l007 commented on issue #13923:
URL: https://github.com/apache/druid/issues/13923#issuecomment-1467118642

   @gianm Thank you for taking a look. Designing this as an `InputSource` is something I'd considered, but my concern is that it would create a dependency across extensions: if the IcebergInputSource implicitly identifies that it needs to delegate to HdfsInputSource, wouldn't the hdfs extension need to be available to the iceberg extension at compile time?
   
   Or are you suggesting that `IcebergInputSource` would follow a design similar to `CombiningInputSource`, where the delegate input source is also specified in the spec, something like:
   
   "ioConfig": {
         "type": "index_parallel",
         "inputSource": {
           "type": "iceberg",
           "tableName": "logs",
           "namespace": "webapp",
           "partitionColumn": "event_time",
           "intervals": ["2023-01-26T00:00:00.000Z/2023-02-18T00:00:00.000Z"],
          "delegateInputSource" : "hdfs"
         }
   }
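
If it is the latter, the compile-time coupling might not be an issue: the delegate field could be typed against the shared InputSource interface and resolved from the spec at runtime through Jackson polymorphic deserialization, the same mechanism extensions already use to register their input source subtypes. Below is a minimal, self-contained sketch of that idea; the names (Source, IcebergSource, HdfsSource, delegate) are hypothetical simplifications and not Druid's actual InputSource API.

import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonTypeInfo;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.jsontype.NamedType;

public class DelegationSketch {
  // Stand-in for the shared InputSource contract that lives in core, not in any extension.
  @JsonTypeInfo(use = JsonTypeInfo.Id.NAME, property = "type")
  interface Source {
    String describe();
  }

  // Provided by a different "extension"; the iceberg side never references this class directly.
  static class HdfsSource implements Source {
    private final String paths;

    @JsonCreator
    HdfsSource(@JsonProperty("paths") String paths) {
      this.paths = paths;
    }

    @Override
    public String describe() {
      return "hdfs:" + paths;
    }
  }

  // Compiles against only the Source interface, never a concrete delegate class.
  static class IcebergSource implements Source {
    private final String tableName;
    private final Source delegate;

    @JsonCreator
    IcebergSource(
        @JsonProperty("tableName") String tableName,
        @JsonProperty("delegate") Source delegate
    ) {
      this.tableName = tableName;
      this.delegate = delegate;
    }

    @Override
    public String describe() {
      return "iceberg:" + tableName + " -> " + delegate.describe();
    }
  }

  public static void main(String[] args) throws Exception {
    ObjectMapper mapper = new ObjectMapper();
    // In Druid each extension registers its subtypes through a Jackson module; this mimics that step.
    mapper.registerSubtypes(
        new NamedType(HdfsSource.class, "hdfs"),
        new NamedType(IcebergSource.class, "iceberg")
    );

    String json = "{\"type\":\"iceberg\",\"tableName\":\"logs\","
        + "\"delegate\":{\"type\":\"hdfs\",\"paths\":\"/warehouse/logs\"}}";
    Source source = mapper.readValue(json, Source.class);
    System.out.println(source.describe()); // iceberg:logs -> hdfs:/warehouse/logs
  }
}

The point of the sketch is that the concrete delegate is chosen from the "type" field at runtime, so the iceberg module would only need the shared interface at compile time.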


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@druid.apache.org
For additional commands, e-mail: commits-help@druid.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org