Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2022/09/22 17:59:14 UTC

[GitHub] [hudi] Xiaohan-Shen opened a new issue, #6747: [SUPPORT] AWSDmsAvroPayload not found querying _rt table MoR

Xiaohan-Shen opened a new issue, #6747:
URL: https://github.com/apache/hudi/issues/6747

   **Describe the problem you faced**
   
   When I try to query the _rt table using `select count(*) from table_rt` through Hive or Spark SQL, an exception is thrown saying AWSDmsAvroPayload is not found. Querying the _ro table or a CoW table works fine.
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Start an EMR 6.5.0 cluster and run DeltaStreamer to load a Hudi MoR table from DMS with these configs:
   ```
   spark-submit \
   --jars /usr/lib/spark/external/lib/spark-avro.jar,/usr/lib/hudi/hudi-spark-bundle.jar,/usr/lib/hudi/hudi-utilities-bundle.jar \
   --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
   --packages org.apache.hudi:hudi-spark-bundle_2.12:0.9.0,org.apache.spark:spark-avro_2.12:3.0.1 \
   --master yarn --deploy-mode client /usr/lib/hudi/hudi-utilities-bundle.jar \
   --table-type MERGE_ON_READ \
   --source-ordering-field timestamp \
   --source-class org.apache.hudi.utilities.sources.ParquetDFSSource \
   --target-base-path s3://mysql-data-replication/HATtrick-sf10/MERGE_ON_READ_hudi_HAT/FRESHNESS1 \
   --target-table FRESHNESS1 \
   --transformer-class org.apache.hudi.utilities.transform.AWSDmsTransformer \
   --continuous \
   --hoodie-conf hoodie.datasource.write.recordkey.field=F_CLIENTNUM \
   --hoodie-conf hoodie.datasource.write.partitionpath.field="" \
   --hoodie-conf hoodie.deltastreamer.source.dfs.root=s3://mysql-data-replication/HATtrick-sf10/HAT/FRESHNESS1 \
   --payload-class org.apache.hudi.payload.AWSDmsAvroPayload \
   --hoodie-conf hoodie.datasource.write.keygenerator.class=org.apache.hudi.keygen.NonpartitionedKeyGenerator \
   --hoodie-conf hoodie.datasource.hive_sync.table=FRESHNESS1 \
   --hoodie-conf hoodie.datasource.hive_sync.partition_extractor_class=org.apache.hudi.hive.NonPartitionedExtractor \
   --hoodie-conf hoodie.datasource.hive_sync.use_jdbc=true \
   --hoodie-conf hoodie.datasource.hive_sync.jdbcurl=jdbc:hive2://localhost:10000 \
   --hoodie-conf hoodie.datasource.hive_sync.enable=true \
   --hoodie-conf hoodie.datasource.hive_sync.database=HAT \
   --enable-hive-sync
   ```
   2. After a change has been captured by DeltaStreamer, query the _rt table through Hive:
   `select count(*) from database.table_rt;`
   
   **Expected behavior**
   
   The `select count(*)` query on the _rt table should succeed and return the row count, just as it does for the _ro and CoW tables.
   
   **Environment Description**
   
   * Hudi version : 0.9.0
   
   * Spark version : 3.1.2
   
   * Hive version : 3.1.2
   
   * Hadoop version : 3.2.1-amzn-5
   
   * Storage (HDFS/S3/GCS..) : S3
   
   * Running on Docker? (yes/no) : no
   
   
   **Additional context**
   
   I tried adding hudi-utilities-bundle.jar directly to `hive.aux.jars.path`, but had no luck.
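   
   For context, here is a rough sketch of how the bundle can be exposed to Hive (paths assume the EMR defaults; the per-session `ADD JAR` variant is shown alongside the `hive.aux.jars.path` property mentioned above):
   
   ```
   -- Per-session alternative: run in Beeline / Hive CLI before the query
   ADD JAR /usr/lib/hudi/hudi-utilities-bundle.jar;
   
   -- Or set the property in hive-site.xml (requires a HiveServer2 restart):
   --   <property>
   --     <name>hive.aux.jars.path</name>
   --     <value>file:///usr/lib/hudi/hudi-utilities-bundle.jar</value>
   --   </property>
   
   select count(*) from HAT.FRESHNESS1_rt;
   ```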
   
   **Stacktrace**
   
   ```
   ], TaskAttempt 3 failed, info=[Error: Error while running task ( failure ) : attempt_1663835389926_0022_1_00_000000_3:java.lang.RuntimeException: java.lang.RuntimeException: java.io.IOException: org.apache.hudi.exception.HoodieException: Exception when reading log file 
   	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:302)
   	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:253)
   	at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:374)
   	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
   	at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
   	at java.security.AccessController.doPrivileged(Native Method)
   	at javax.security.auth.Subject.doAs(Subject.java:422)
   	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
   	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
   	at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
   	at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
   	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:750)
   Caused by: java.lang.RuntimeException: java.io.IOException: org.apache.hudi.exception.HoodieException: Exception when reading log file 
   	at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.initNextRecordReader(TezGroupedSplitsInputFormat.java:206)
   	at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.<init>(TezGroupedSplitsInputFormat.java:145)
   	at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat.getRecordReader(TezGroupedSplitsInputFormat.java:111)
   	at org.apache.tez.mapreduce.lib.MRReaderMapred.setupOldRecordReader(MRReaderMapred.java:157)
   	at org.apache.tez.mapreduce.lib.MRReaderMapred.setSplit(MRReaderMapred.java:83)
   	at org.apache.tez.mapreduce.input.MRInput.initFromEventInternal(MRInput.java:703)
   	at org.apache.tez.mapreduce.input.MRInput.initFromEvent(MRInput.java:662)
   	at org.apache.tez.mapreduce.input.MRInputLegacy.checkAndAwaitRecordReaderInitialization(MRInputLegacy.java:150)
   	at org.apache.tez.mapreduce.input.MRInputLegacy.init(MRInputLegacy.java:114)
   	at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.getMRInput(MapRecordProcessor.java:525)
   	at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.init(MapRecordProcessor.java:171)
   	at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:270)
   	... 14 more
   Caused by: java.io.IOException: org.apache.hudi.exception.HoodieException: Exception when reading log file 
   	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
   	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
   	at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:421)
   	at org.apache.hadoop.mapred.split.TezGroupedSplitsInputFormat$TezGroupedSplitsRecordReader.initNextRecordReader(TezGroupedSplitsInputFormat.java:203)
   	... 25 more
   Caused by: org.apache.hudi.exception.HoodieException: Exception when reading log file 
   	at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scan(AbstractHoodieLogRecordScanner.java:285)
   	at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.performScan(HoodieMergedLogRecordScanner.java:99)
   	at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:92)
   	at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner$Builder.build(HoodieMergedLogRecordScanner.java:267)
   	at org.apache.hudi.hadoop.realtime.RealtimeCompactedRecordReader.getMergedLogRecordScanner(RealtimeCompactedRecordReader.java:92)
   	at org.apache.hudi.hadoop.realtime.RealtimeCompactedRecordReader.<init>(RealtimeCompactedRecordReader.java:63)
   	at org.apache.hudi.hadoop.realtime.HoodieRealtimeRecordReader.constructRecordReader(HoodieRealtimeRecordReader.java:70)
   	at org.apache.hudi.hadoop.realtime.HoodieRealtimeRecordReader.<init>(HoodieRealtimeRecordReader.java:47)
   	at org.apache.hudi.hadoop.realtime.HoodieParquetRealtimeInputFormat.getRecordReader(HoodieParquetRealtimeInputFormat.java:123)
   	at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:418)
   	... 26 more
   Caused by: org.apache.hudi.exception.HoodieException: Unable to load class
   	at org.apache.hudi.common.util.ReflectionUtils.getClass(ReflectionUtils.java:57)
   	at org.apache.hudi.common.util.ReflectionUtils.loadPayload(ReflectionUtils.java:78)
   	at org.apache.hudi.common.util.SpillableMapUtils.convertToHoodieRecordPayload(SpillableMapUtils.java:133)
   	at org.apache.hudi.common.util.SpillableMapUtils.convertToHoodieRecordPayload(SpillableMapUtils.java:118)
   	at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.createHoodieRecord(AbstractHoodieLogRecordScanner.java:322)
   	at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processDataBlock(AbstractHoodieLogRecordScanner.java:316)
   	at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processQueuedBlocksForInstant(AbstractHoodieLogRecordScanner.java:352)
   	at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scan(AbstractHoodieLogRecordScanner.java:276)
   	... 35 more
   Caused by: java.lang.ClassNotFoundException: org.apache.hudi.payload.AWSDmsAvroPayload
   	at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
   	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
   	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
   	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
   	at java.lang.Class.forName0(Native Method)
   	at java.lang.Class.forName(Class.java:264)
   	at org.apache.hudi.common.util.ReflectionUtils.getClass(ReflectionUtils.java:54)
   	... 42 more
   ```
   
   




[GitHub] [hudi] nsivabalan commented on issue #6747: [SUPPORT] AWSDmsAvroPayload not found querying _rt table MoR

Posted by GitBox <gi...@apache.org>.
nsivabalan commented on issue #6747:
URL: https://github.com/apache/hudi/issues/6747#issuecomment-1287489380

   Going ahead and closing this one. Feel free to reopen or create a new one; would be happy to help.
   




[GitHub] [hudi] nsivabalan closed issue #6747: [SUPPORT] AWSDmsAvroPayload not found querying _rt table MoR

Posted by GitBox <gi...@apache.org>.
nsivabalan closed issue #6747: [SUPPORT] AWSDmsAvroPayload not found querying _rt table MoR
URL: https://github.com/apache/hudi/issues/6747




[GitHub] [hudi] rahil-c commented on issue #6747: [SUPPORT] AWSDmsAvroPayload not found querying _rt table MoR

Posted by GitBox <gi...@apache.org>.
rahil-c commented on issue #6747:
URL: https://github.com/apache/hudi/issues/6747#issuecomment-1284641124

   The `/usr/lib/hudi/hudi-utilities-bundle.jar` is needed, and the `spark-avro` jar is required if you're running any Hudi release under 0.11.0 (if you are on 0.11.0 or above you can omit this jar). Finally, you can get rid of the spark-bundle, since DeltaStreamer comes from the utilities bundle.
   
   Here is a basic example:
   ```
   
   spark-submit --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
               --jars /usr/lib/spark/external/lib/spark-avro.jar \
               --master yarn \
               --deploy-mode client \
               /usr/lib/hudi/hudi-utilities-bundle.jar \
               ...
               ... rest of params
               ...
             
   ```




[GitHub] [hudi] yihua commented on issue #6747: [SUPPORT] AWSDmsAvroPayload not found querying _rt table MoR

Posted by GitBox <gi...@apache.org>.
yihua commented on issue #6747:
URL: https://github.com/apache/hudi/issues/6747#issuecomment-1255434329

   @Xiaohan-Shen If you are using the Hudi jars from EMR via `--jars /usr/lib/spark/external/lib/spark-avro.jar,/usr/lib/hudi/hudi-spark-bundle.jar,/usr/lib/hudi/hudi-utilities-bundle.jar`, you don't need to pull in the OSS packages with `--packages org.apache.hudi:hudi-spark-bundle_2.12:0.9.0,org.apache.spark:spark-avro_2.12:3.0.1`.
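   
   For instance, a trimmed launch along these lines should behave the same (just a sketch: it keeps the EMR-provided jars from `--jars` and drops the `--packages` coordinates; the remaining options stay as in your original command):
   
   ```
   spark-submit \
     --jars /usr/lib/spark/external/lib/spark-avro.jar,/usr/lib/hudi/hudi-spark-bundle.jar,/usr/lib/hudi/hudi-utilities-bundle.jar \
     --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
     --master yarn \
     --deploy-mode client \
     /usr/lib/hudi/hudi-utilities-bundle.jar \
     ...
     ... rest of the --table-type / --payload-class / --hoodie-conf options unchanged
     ...
   ```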
   
   @rahil-c @umehrot2 could you folks help here regarding the EMR setup?




[GitHub] [hudi] nsivabalan commented on issue #6747: [SUPPORT] AWSDmsAvroPayload not found querying _rt table MoR

Posted by GitBox <gi...@apache.org>.
nsivabalan commented on issue #6747:
URL: https://github.com/apache/hudi/issues/6747#issuecomment-1284335066

   And with Hudi 0.12, you don't need spark-avro at all, so you can drop that as well.
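   
   For example, on 0.12 the launch could shrink to roughly this (just a sketch, assuming the utilities bundle still lives at the EMR default path):
   
   ```
   spark-submit \
     --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
     --master yarn \
     --deploy-mode client \
     /usr/lib/hudi/hudi-utilities-bundle.jar \
     ...
     ... rest of the params as before
     ...
   ```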
   




[GitHub] [hudi] nsivabalan commented on issue #6747: [SUPPORT] AWSDmsAvroPayload not found querying _rt table MoR

Posted by GitBox <gi...@apache.org>.
nsivabalan commented on issue #6747:
URL: https://github.com/apache/hudi/issues/6747#issuecomment-1284333773

   Here is my inference so far.
   
   To run DeltaStreamer, you just need hudi-utilities-bundle; the spark-bundle is not required in addition.
   
   ```
   --jars /usr/lib/spark/external/lib/spark-avro.jar,/usr/lib/hudi/hudi-spark-bundle.jar,/usr/lib/hudi/hudi-utilities-bundle.jar 
   --packages org.apache.hudi:hudi-spark-bundle_2.12:0.9.0,org.apache.spark:spark-avro_2.12:3.0.1 
   ```
   
   From your command, it looks like you are pulling in the spark bundle via `--packages` as well as both the spark and utilities bundles via `--jars`. Can you stop pulling in the spark bundle, and pull in the utilities bundle only once? Also, you don't even need `--packages` at all, because spark-submit expects an application jar anyway.
   
   ```
   spark-submit
   .
   .
   --class CLASSNAME
   JAR_LOCATION
   param1
   param2
   .
   .
   ```
   
   
   
   
   




[GitHub] [hudi] Xiaohan-Shen commented on issue #6747: [SUPPORT] AWSDmsAvroPayload not found querying _rt table MoR

Posted by GitBox <gi...@apache.org>.
Xiaohan-Shen commented on issue #6747:
URL: https://github.com/apache/hudi/issues/6747#issuecomment-1255438456

   @yihua Thanks for the reply! Not sure if this is relevant to the issue here, but I tried to run without `--packages`, and there was some sort of ClassNotFound exception when running DeltaStreamer. I am also confused as to why, but if I run it with these `--packages`, there is no exception running DeltaStreamer.




[GitHub] [hudi] Xiaohan-Shen commented on issue #6747: [SUPPORT] AWSDmsAvroPayload not found querying _rt table MoR

Posted by GitBox <gi...@apache.org>.
Xiaohan-Shen commented on issue #6747:
URL: https://github.com/apache/hudi/issues/6747#issuecomment-1257091169

   @yihua @rahil-c @umehrot2 Sorry to interrupt! Any suggestions on how I can work around this ClassNotFound issue?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@hudi.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org