Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2019/10/15 01:30:25 UTC

[GitHub] [incubator-hudi] umehrot2 opened a new pull request #957: [HUDI-268] Provide mechanism to shade and relocate Avro dependency in hadoop-mr-bundle

URL: https://github.com/apache/incubator-hudi/pull/957
 
 
   The earlier PR https://github.com/apache/incubator-hudi/pull/915 was closed by mistake when its branch was deleted, so this PR recreates it. It also addresses the comments by @bvaradar on the previous PR.
   
   **Why is this change required?**
   
   Hudi currently depends on Parquet 1.8.1 and Avro 1.7.7, which work fine with older versions of Spark and Hive.
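
   These versions are ordinary Maven properties, which is why the build command below can override them with `-D` flags. A minimal sketch of the relevant defaults, assuming the property names match those flags (the exact layout of Hudi's pom.xml may differ):

   ```
   <!-- Illustrative sketch only, not a verbatim excerpt of Hudi's pom.xml.
        Property names are inferred from the -Davro.version / -Dparquet.version
        overrides shown below. -->
   <properties>
     <avro.version>1.7.7</avro.version>
     <parquet.version>1.8.1</parquet.version>
   </properties>
   ```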
   
   But when we build it against Spark 2.4.3, which uses Parquet 1.10.1 and Avro 1.8.2, with:
   
   ```
   mvn clean install -DskipTests -DskipITs -Dhadoop.version=2.8.5 -Dspark.version=2.4.3 -Dhbase.version=1.4.10 -Dhive.version=2.3.5 -Dparquet.version=1.10.1 -Davro.version=1.8.2
   ```
   
   we run into a runtime failure on Hive 2.3.5 when querying RT tables:
   
   ```
   hive> select record_key from mytable_mor_sep20_01_rt limit 10;
   OK
   Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/LogicalType
   	at org.apache.hudi.hadoop.realtime.AbstractRealtimeRecordReader.init(AbstractRealtimeRecordReader.java:323)
   	at org.apache.hudi.hadoop.realtime.AbstractRealtimeRecordReader.<init>(AbstractRealtimeRecordReader.java:105)
   	at org.apache.hudi.hadoop.realtime.RealtimeCompactedRecordReader.<init>(RealtimeCompactedRecordReader.java:48)
   	at org.apache.hudi.hadoop.realtime.HoodieRealtimeRecordReader.constructRecordReader(HoodieRealtimeRecordReader.java:67)
   	at org.apache.hudi.hadoop.realtime.HoodieRealtimeRecordReader.<init>(HoodieRealtimeRecordReader.java:45)
   	at org.apache.hudi.hadoop.realtime.HoodieParquetRealtimeInputFormat.getRecordReader(HoodieParquetRealtimeInputFormat.java:234)
   	at org.apache.hadoop.hive.ql.exec.FetchOperator$FetchInputFormatSplit.getRecordReader(FetchOperator.java:695)
   	at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:333)
   	at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:459)
   	at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:428)
   	at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:147)
   	at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:2208)
   	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:253)
   	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:184)
   	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
   	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
   	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
   	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
   	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   	at java.lang.reflect.Method.invoke(Method.java:498)
   	at org.apache.hadoop.util.RunJar.run(RunJar.java:239)
   	at org.apache.hadoop.util.RunJar.main(RunJar.java:153)
   Caused by: java.lang.ClassNotFoundException: org.apache.avro.LogicalType
   	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
   	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
   	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
   	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
   ```
   This happens because we shade parquet-avro, which is now 1.10.1 and requires Avro 1.8.2, the release in which the `LogicalType` class was introduced. However, Hive 2.3.5 has Avro 1.7.7 on its runtime classpath, which does not contain `LogicalType`.
   
   Thus, we are providing a way to shade and relocate Avro, when needed, through a Maven profile.
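
   For reference, here is a minimal sketch of what such an opt-in profile could look like in the hadoop-mr-bundle module, using the maven-shade-plugin to relocate Avro inside the bundle jar. The profile id and the relocated package prefix are illustrative assumptions, not necessarily what this PR uses:

   ```
   <!-- Hypothetical sketch: an opt-in profile that bundles and relocates Avro.
        Profile id and shaded package prefix are assumptions for illustration. -->
   <profile>
     <id>shade-avro</id>
     <build>
       <plugins>
         <plugin>
           <groupId>org.apache.maven.plugins</groupId>
           <artifactId>maven-shade-plugin</artifactId>
           <executions>
             <execution>
               <phase>package</phase>
               <goals>
                 <goal>shade</goal>
               </goals>
               <configuration>
                 <relocations>
                   <relocation>
                     <!-- Rewrite Avro classes, and all references to them, into
                          a Hudi-private package so the bundled Avro 1.8.2 cannot
                          collide with Hive's Avro 1.7.7 at runtime. -->
                     <pattern>org.apache.avro</pattern>
                     <shadedPattern>org.apache.hudi.org.apache.avro</shadedPattern>
                   </relocation>
                 </relocations>
               </configuration>
             </execution>
           </executions>
         </plugin>
       </plugins>
     </build>
   </profile>
   ```

   A build targeting a Hive that ships the older Avro could then activate the profile with something like `mvn clean install -Pshade-avro ...` alongside the version overrides above.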
   
