Posted to commits@hudi.apache.org by "sivabalan narayanan (Jira)" <ji...@apache.org> on 2021/01/26 16:12:00 UTC

[jira] [Resolved] (HUDI-1535) Hudi spark datasource fails w/ NoClassDefFoundError: org/apache/hudi/client/common/HoodieEngineContext

     [ https://issues.apache.org/jira/browse/HUDI-1535?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

sivabalan narayanan resolved HUDI-1535.
---------------------------------------
    Fix Version/s: 0.7.0
         Assignee: sivabalan narayanan
       Resolution: Fixed

> Hudi spark datasource fails w/ NoClassDefFoundError: org/apache/hudi/client/common/HoodieEngineContext
> ------------------------------------------------------------------------------------------------------
>
>                 Key: HUDI-1535
>                 URL: https://issues.apache.org/jira/browse/HUDI-1535
>             Project: Apache Hudi
>          Issue Type: Bug
>          Components: Spark Integration
>    Affects Versions: 0.7.0
>            Reporter: sivabalan narayanan
>            Assignee: sivabalan narayanan
>            Priority: Major
>              Labels: pull-request-available, release-blocker
>             Fix For: 0.7.0
>
>
> I tried the Quick Start Guide with the latest master.
> {code:java}
> // first insert
> scala> df.write.format("hudi").
>      |   options(getQuickstartWriteConfigs).
>      |   option(PRECOMBINE_FIELD_OPT_KEY, "ts").
>      |   option(RECORDKEY_FIELD_OPT_KEY, "uuid").
>      |   option(PARTITIONPATH_FIELD_OPT_KEY, "partitionpath").
>      |   option(TABLE_NAME, tableName).
>      |   mode(Overwrite).
>      |   save(basePath)
> java.lang.NoClassDefFoundError: org/apache/hudi/client/common/HoodieEngineContext
>   at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:120)
>   at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:134)
>   at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
>   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
>   at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
>   at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
>   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
>   at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
>   at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
>   at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
>   at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
>   at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
>   at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
>   at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
>   at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
>   at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
>   at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
>   at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
>   at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
>   at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
>   at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
>   ... 68 elided
> Caused by: java.lang.ClassNotFoundException: org.apache.hudi.client.common.HoodieEngineContext
>   at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>   ... 91 more
> {code}
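The failure pattern in the trace is a `NoClassDefFoundError` caused by a `ClassNotFoundException`: the driver classpath simply does not contain `org.apache.hudi.client.common.HoodieEngineContext`. A quick way to confirm whether a given class is visible from the running JVM, before attempting the write, is a `Class.forName` probe. This is a minimal sketch (the `ClasspathCheck` object name is illustrative, not part of Hudi):

```scala
object ClasspathCheck {
  // Returns true if the named class can be loaded from the current classpath.
  def isOnClasspath(className: String): Boolean =
    try {
      Class.forName(className)
      true
    } catch {
      case _: ClassNotFoundException => false
    }

  def main(args: Array[String]): Unit = {
    // The class reported missing in the stack trace above.
    val cls = "org.apache.hudi.client.common.HoodieEngineContext"
    println(s"$cls present: ${isOnClasspath(cls)}")
  }
}
```

Running this in the same spark-shell session (paste mode) prints `false` when the bundle jar on `--jars` does not package the class, which matches the error seen here.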
> Command used to launch the Spark shell:
> {code:java}
> ./bin/spark-shell   --packages org.apache.spark:spark-avro_2.11:2.4.4   --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer'   --jars hudi-utilities-bundle_2.11-0.7.0-SNAPSHOT.jar
> {code}
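Note that the command above passes the utilities bundle, not the Spark bundle. A hedged sketch of the alternative launch command, assuming a locally built `hudi-spark-bundle_2.11-0.7.0-SNAPSHOT.jar` (the artifact the quickstart pairs with the datasource path) with all other flags unchanged from the report:

{code:java}
./bin/spark-shell \
  --packages org.apache.spark:spark-avro_2.11:2.4.4 \
  --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
  --jars hudi-spark-bundle_2.11-0.7.0-SNAPSHOT.jar
{code}

Whether this sidesteps the error depends on how the bundles package `HoodieEngineContext` at this commit; the resolution above indicates the packaging itself was fixed for 0.7.0.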
> Steps to reproduce:
> Follow the Quick Start Guide, launching the Spark shell with the command above.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)