Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2020/10/10 15:15:58 UTC
[GitHub] [hudi] tandonraghav opened a new issue #2165: [SUPPORT] Exception while Querying Hive _rt table
tandonraghav opened a new issue #2165:
URL: https://github.com/apache/hudi/issues/2165
**Describe the problem you faced**
I am using Spark DataFrames to persist a Hudi table, with Hive sync enabled. Querying the *_ro table works fine, but querying the *_rt table throws an exception.
- I am using a custom class for `preCombine` and `combineAndUpdateValue`, so I have placed my jar in the ${Hive}/lib folder.
- I also tried setting `set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;` and `set hive.fetch.task.conversion=none;` in the Hive session.
Hive - 2.3.7
Spark - 2
hudi-hadoop-mr-bundle-0.6.0.jar
Hudi - 0.6.0
Actual Exception -> **Caused by: java.lang.ClassCastException: org.apache.hudi.org.apache.avro.generic.GenericData$Record cannot be cast to org.apache.avro.generic.GenericRecord**
````
CREATE EXTERNAL TABLE `bhuvan_123_ro`(
`_hoodie_commit_time` string,
`_hoodie_commit_seqno` string,
`_hoodie_record_key` string,
`_hoodie_partition_path` string,
`_hoodie_file_name` string,
`ts_ms` bigint,
`pincode` double,
`image_link` string,
`_id` string,
`op` string,
`a` string,
`b` string,
`c` string,
`d` string,
`e` double)
PARTITIONED BY (
`db_name` string)
ROW FORMAT SERDE
'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
STORED AS INPUTFORMAT
'org.apache.hudi.hadoop.HoodieParquetInputFormat'
OUTPUTFORMAT
'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
LOCATION
'file:/tmp/test/hudi-user-data/MOE_PRODUCT_INFO.bhuvan_123'
TBLPROPERTIES (
'last_commit_time_sync'='20201010202918',
'transient_lastDdlTime'='1602341935')
Time taken: 0.192 seconds, Fetched: 29 row(s)
````
Exception:
````
org.apache.hudi.exception.HoodieException: Unable to instantiate payload class
at org.apache.hudi.common.util.ReflectionUtils.loadPayload(ReflectionUtils.java:78) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
at org.apache.hudi.common.util.SpillableMapUtils.convertToHoodieRecordPayload(SpillableMapUtils.java:116) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processDataBlock(AbstractHoodieLogRecordScanner.java:277) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processQueuedBlocksForInstant(AbstractHoodieLogRecordScanner.java:306) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scan(AbstractHoodieLogRecordScanner.java:239) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:81) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
at org.apache.hudi.hadoop.realtime.RealtimeCompactedRecordReader.getMergedLogRecordScanner(RealtimeCompactedRecordReader.java:76) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
at org.apache.hudi.hadoop.realtime.RealtimeCompactedRecordReader.<init>(RealtimeCompactedRecordReader.java:55) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
at org.apache.hudi.hadoop.realtime.HoodieRealtimeRecordReader.constructRecordReader(HoodieRealtimeRecordReader.java:70) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
at org.apache.hudi.hadoop.realtime.HoodieRealtimeRecordReader.<init>(HoodieRealtimeRecordReader.java:47) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
at org.apache.hudi.hadoop.realtime.HoodieParquetRealtimeInputFormat.getRecordReader(HoodieParquetRealtimeInputFormat.java:186) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:376) ~[hive-exec-2.3.7.jar:2.3.7]
at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169) ~[hadoop-mapreduce-client-core-2.10.0.jar:?]
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:438) ~[hadoop-mapreduce-client-core-2.10.0.jar:?]
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343) ~[hadoop-mapreduce-client-core-2.10.0.jar:?]
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:270) ~[hadoop-mapreduce-client-common-2.10.0.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_222]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_222]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_222]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_222]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_222]
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_222]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_222]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_222]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_222]
at org.apache.hudi.common.util.ReflectionUtils.loadPayload(ReflectionUtils.java:76) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
... 20 more
Caused by: java.lang.ClassCastException: org.apache.hudi.org.apache.avro.generic.GenericData$Record cannot be cast to org.apache.avro.generic.GenericRecord
at com.moengage.dpm.jobs.MergeHudiPayload.<init>(MergeHudiPayload.java:41) ~[dpm-feed-spark-jobs-1.0.10-rc0.jar:?]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_222]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_222]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_222]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_222]
at org.apache.hudi.common.util.ReflectionUtils.loadPayload(ReflectionUtils.java:76) ~[hudi-hadoop-mr-bundle-0.6.0.jar:0.6.0]
````
Line where the exception is thrown:
````
public MergeHudiPayload(Option<GenericRecord> record) {
    this(record.isPresent() ? record.get() : null, (record1) -> 0); // natural order
}
````
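The failing cast can be reproduced in isolation: when a bundle relocates Avro into its own shaded package, the relocated record class no longer implements the original `GenericRecord` interface, even though the two types look identical. A minimal, self-contained illustration (not Hudi code; the class names here are stand-ins for the relocated and original Avro types):

```java
// Illustration of why a relocated (shaded) class cannot be cast back to
// the original interface, mirroring the GenericData$Record ->
// GenericRecord failure in the stack trace above.
public class ShadedCastDemo {
    // Stand-in for org.apache.avro.generic.GenericRecord.
    interface GenericRecord {}

    // Stand-in for the relocated org.apache.hudi.org.apache.avro copy:
    // structurally identical, but it does NOT implement the original interface.
    static class ShadedRecord {}

    public static void main(String[] args) {
        Object record = new ShadedRecord(); // what the shaded reader hands over
        try {
            GenericRecord r = (GenericRecord) record; // same cast the payload does
            System.out.println("cast succeeded: " + r);
        } catch (ClassCastException e) {
            // The JVM compares classes by loader + fully qualified name,
            // so the relocated type is simply a different type.
            System.out.println("ClassCastException: " + e.getMessage());
        }
    }
}
```

This is why the payload jar must be compiled or shaded against the same Avro package that the reading bundle uses.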
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
[GitHub] [hudi] tandonraghav edited a comment on issue #2165: [SUPPORT] Exception while Querying Hive _rt table
tandonraghav edited a comment on issue #2165:
URL: https://github.com/apache/hudi/issues/2165#issuecomment-707163257
Attaching the Presto logs:
````
2020-10-12T14:41:49.229Z INFO 20201012_144143_00011_zymbu.1.0.0-0-44 org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner Merging the final data blocks
2020-10-12T14:41:49.229Z INFO 20201012_144143_00011_zymbu.1.0.0-0-44 org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner Number of remaining logblocks to merge 1
2020-10-12T14:41:49.283Z ERROR 20201012_144143_00011_zymbu.1.0.0-0-44 org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner Got exception when reading log file
org.apache.hudi.exception.HoodieException: Unable to instantiate payload class
at org.apache.hudi.common.util.ReflectionUtils.loadPayload(ReflectionUtils.java:69)
at org.apache.hudi.common.util.SpillableMapUtils.convertToHoodieRecordPayload(SpillableMapUtils.java:116)
at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processAvroDataBlock(AbstractHoodieLogRecordScanner.java:276)
at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processQueuedBlocksForInstant(AbstractHoodieLogRecordScanner.java:305)
at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scan(AbstractHoodieLogRecordScanner.java:238)
at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:81)
at org.apache.hudi.hadoop.realtime.RealtimeCompactedRecordReader.getMergedLogRecordScanner(RealtimeCompactedRecordReader.java:69)
at org.apache.hudi.hadoop.realtime.RealtimeCompactedRecordReader.<init>(RealtimeCompactedRecordReader.java:52)
at org.apache.hudi.hadoop.realtime.HoodieRealtimeRecordReader.constructRecordReader(HoodieRealtimeRecordReader.java:69)
at org.apache.hudi.hadoop.realtime.HoodieRealtimeRecordReader.<init>(HoodieRealtimeRecordReader.java:47)
at org.apache.hudi.hadoop.realtime.HoodieParquetRealtimeInputFormat.getRecordReader(HoodieParquetRealtimeInputFormat.java:253)
at com.facebook.presto.hive.HiveUtil.createRecordReader(HiveUtil.java:251)
at com.facebook.presto.hive.GenericHiveRecordCursorProvider.lambda$createRecordCursor$0(GenericHiveRecordCursorProvider.java:74)
at com.facebook.presto.hive.authentication.UserGroupInformationUtils.lambda$executeActionInDoAs$0(UserGroupInformationUtils.java:29)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:360)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1824)
at com.facebook.presto.hive.authentication.UserGroupInformationUtils.executeActionInDoAs(UserGroupInformationUtils.java:27)
at com.facebook.presto.hive.authentication.ImpersonatingHdfsAuthentication.doAs(ImpersonatingHdfsAuthentication.java:39)
at com.facebook.presto.hive.HdfsEnvironment.doAs(HdfsEnvironment.java:82)
at com.facebook.presto.hive.GenericHiveRecordCursorProvider.createRecordCursor(GenericHiveRecordCursorProvider.java:73)
at com.facebook.presto.hive.HivePageSourceProvider.createHivePageSource(HivePageSourceProvider.java:370)
at com.facebook.presto.hive.HivePageSourceProvider.createPageSource(HivePageSourceProvider.java:137)
at com.facebook.presto.hive.HivePageSourceProvider.createPageSource(HivePageSourceProvider.java:113)
at com.facebook.presto.spi.connector.classloader.ClassLoaderSafeConnectorPageSourceProvider.createPageSource(ClassLoaderSafeConnectorPageSourceProvider.java:52)
at com.facebook.presto.split.PageSourceManager.createPageSource(PageSourceManager.java:69)
at com.facebook.presto.operator.TableScanOperator.getOutput(TableScanOperator.java:259)
at com.facebook.presto.operator.Driver.processInternal(Driver.java:379)
at com.facebook.presto.operator.Driver.lambda$processFor$8(Driver.java:283)
at com.facebook.presto.operator.Driver.tryWithLock(Driver.java:675)
at com.facebook.presto.operator.Driver.processFor(Driver.java:276)
at com.facebook.presto.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:1077)
at com.facebook.presto.execution.executor.PrioritizedSplitRunner.process(PrioritizedSplitRunner.java:162)
at com.facebook.presto.execution.executor.TaskExecutor$TaskRunner.run(TaskExecutor.java:545)
at com.facebook.presto.$gen.Presto_0_232____20201012_144123_1.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hudi.common.util.ReflectionUtils.loadPayload(ReflectionUtils.java:67)
... 37 more
Caused by: java.lang.ClassCastException: org.apache.hudi.org.apache.avro.generic.GenericData$Record cannot be cast to org.apache.avro.generic.GenericRecord
at xxx.MergeHudiPayload.<init>(MergeHudiPayload.java:41)
... 42 more
````
[GitHub] [hudi] tandonraghav commented on issue #2165: [SUPPORT] Exception while Querying Hive _rt table
tandonraghav commented on issue #2165:
URL: https://github.com/apache/hudi/issues/2165#issuecomment-707077681
@bvaradar There is a clear incompatibility between the hudi-hadoop-mr-bundle jar and hudi-spark-bundle_2.11.jar.
Can you please check and clarify? I don't think it is related to any classpath issue.
[GitHub] [hudi] tandonraghav closed issue #2165: [SUPPORT] Exception while Querying Hive _rt table
tandonraghav closed issue #2165:
URL: https://github.com/apache/hudi/issues/2165
[GitHub] [hudi] tandonraghav commented on issue #2165: [SUPPORT] Exception while Querying Hive _rt table
tandonraghav commented on issue #2165:
URL: https://github.com/apache/hudi/issues/2165#issuecomment-707749644
@bvaradar Thanks for the help. I was able to resolve it by using the shaded jar.
I feel this should be documented better:
https://docs.aws.amazon.com/emr/latest/ReleaseGuide/emr-hudi-considerations.html & https://hudi.apache.org/docs/querying_data.html
[GitHub] [hudi] bvaradar commented on issue #2165: [SUPPORT] Exception while Querying Hive _rt table
bvaradar commented on issue #2165:
URL: https://github.com/apache/hudi/issues/2165#issuecomment-706999132
It looks like there is more than one fat bundle on the classpath (hudi-hadoop-mr-bundle and hudi-spark-bundle)?
If that is the case, you need to use just hudi-spark-bundle.
Also, try passing your custom jar with the --jars option?
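The suggestion can be sketched as follows; the jar paths, class name, and application jar below are placeholders, not taken from this thread:

```shell
# Hypothetical spark-submit: ship only hudi-spark-bundle plus the custom
# payload jar; do not also place hudi-hadoop-mr-bundle on the Spark classpath.
spark-submit \
  --jars /path/to/hudi-spark-bundle_2.11-0.6.0.jar,/path/to/custom-payload.jar \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --class com.example.YourHudiJob \
  /path/to/your-app.jar
```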
[GitHub] [hudi] tandonraghav commented on issue #2165: [SUPPORT] Exception while Querying Hive _rt table
tandonraghav commented on issue #2165:
URL: https://github.com/apache/hudi/issues/2165#issuecomment-707231701
@bvaradar I was trying this with Presto and Glue on AWS EMR. presto-bundle is present inside <presto-inst>/plugins/hive-hadoop2/.
But my question is why this error occurs: `Caused by: java.lang.ClassCastException: org.apache.hudi.org.apache.avro.generic.GenericData$Record cannot be cast to org.apache.avro.generic.GenericRecord`
Why is there a difference in the GenericRecord class (HoodieRecordPayload.class) between hudi-spark-bundle and presto/hudi-hadoop-mr-bundle?
I am also aware that only specific versions of Presto support snapshot queries. But as the stack trace shows, it is not able to cast properly.
[GitHub] [hudi] bvaradar commented on issue #2165: [SUPPORT] Exception while Querying Hive _rt table
bvaradar commented on issue #2165:
URL: https://github.com/apache/hudi/issues/2165#issuecomment-707475775
@tandonraghav : Yes, you need to shade the jar containing the custom record payload. Here is some context: http://hudi.apache.org/releases.html#release-highlights-1
Look for the section starting with:
```
With 0.5.1, hudi-hadoop-mr-bundle which is used by query engines such as presto and hive includes shaded avro package to support hudi real time queries through these
```
More Context: https://issues.apache.org/jira/browse/HUDI-519
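Concretely, the shading bvaradar refers to can be done with a relocation in the payload jar's own build. A sketch, assuming the payload is built with Maven (the relocated prefix matches the one in the stack trace above; version tags omitted):

```xml
<!-- Sketch: relocate Avro inside the custom payload jar so the payload's
     GenericRecord matches the copy bundled in hudi-hadoop-mr-bundle
     (org.apache.hudi.org.apache.avro.*). -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>org.apache.avro.</pattern>
            <shadedPattern>org.apache.hudi.org.apache.avro.</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```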
[GitHub] [hudi] bvaradar commented on issue #2165: [SUPPORT] Exception while Querying Hive _rt table
bvaradar commented on issue #2165:
URL: https://github.com/apache/hudi/issues/2165#issuecomment-707226897
@tandonraghav : It was not clear from your original description whether you are running a Spark or a Presto query. Looking at the previous comments, it looks like you are making Presto queries? Have you included presto-bundle, which is the only bundle you should have on the runtime classpath for Presto?
@bhasudha : Any other things that we need to be aware of?