Posted to dev@hudi.apache.org by Lian Jiang <ji...@gmail.com> on 2020/05/18 20:11:50 UTC

hudi dependency conflicts for test

Hi,

I am using Hudi in a Scala Gradle project:

dependencies {
    compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.4.4'
    compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.4.4'
    compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.11'
    compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
    compile group: 'org.apache.spark', name: 'spark-avro_2.11', version: '2.4.4'
    compile group: 'com.amazonaws', name: 'aws-java-sdk', version: '1.11.297'
    compile group: 'com.zillow.datacontracts', name: 'contract-evaluation-library', version: '0.1.0.master.98a438b'
    compile (group: 'org.apache.hudi', name: 'hudi-spark_2.11', version: '0.5.1-incubating') {
        exclude group: 'org.scala-lang', module: 'scala-library'
        exclude group: 'org.scalatest', module: 'scalatest_2.12'
    }

    testCompile group: 'junit', name: 'junit', version: '4.12'
    testCompile group: 'org.scalatest', name: 'scalatest_2.11', version: '3.2.0-SNAP7'
    testCompile group: 'org.mockito', name: 'mockito-scala_2.11', version: '1.5.12'
}

The code below throws the exception:

java.lang.NoSuchMethodError: org.scalatest.mockito.MockitoSugar.$init$(Lorg/scalatest/mockito/MockitoSugar;)V

import org.junit.runner.RunWith
import org.scalatest.FunSuite
import org.scalatest.junit.JUnitRunner
import org.scalatest.mockito.MockitoSugar

@RunWith(classOf[JUnitRunner])
class BaseTest extends FunSuite with MockitoSugar {
}

Removing org.apache.hudi from the dependency list makes the code work. Does
anybody know how to include the Hudi dependency without it conflicting with
the tests?
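
(One possible direction, sketched here on the assumption that the clash comes
from a transitive scalatest pulled in by hudi-spark_2.11: ask Gradle which
scalatest versions land on the test classpath,

./gradlew dependencyInsight --dependency scalatest --configuration testRuntimeClasspath

then either exclude the unwanted module from hudi-spark_2.11 or force a single
version in build.gradle:

configurations.all {
    resolutionStrategy {
        // assumption: pin everything to the scalatest_2.11 the tests already use
        force 'org.scalatest:scalatest_2.11:3.2.0-SNAP7'
    }
}

Untested against this exact dependency set.)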

Appreciate any help!

Regards

Leon

Re: hudi dependency conflicts for test

Posted by Shiyan Xu <xu...@gmail.com>.
That was a close one. :)

On Thu, May 21, 2020 at 10:46 AM Vinoth Chandar <vi...@apache.org> wrote:

> Wow.. Race condition :) ..
>
> Thanks for racing, Raymond!
>
> On Thu, May 21, 2020 at 10:08 AM Shiyan Xu <xu...@gmail.com>
> wrote:
>
> > Hi Lian, it appears that you need to have spark-avro_2.11:2.4.4 in your
> > classpath.
> >
> >
> >
> > On Thu, May 21, 2020 at 10:04 AM Lian Jiang <ji...@gmail.com>
> wrote:
> >
> > > Thanks Balaji.
> > >
> > > My unit test failed due to a dependency incompatibility. Any ideas would
> > > be highly appreciated!
> > >
> > >
> > > The test is copied from hudi quick start:
> > >
> > > import org.apache.hudi.QuickstartUtils._
> > >
> > > import scala.collection.JavaConversions._
> > > import org.apache.spark.SparkConf
> > > import org.apache.spark.sql.SparkSession
> > > import org.apache.spark.sql.SaveMode._
> > > import org.apache.hudi.DataSourceReadOptions._
> > > import org.apache.hudi.DataSourceWriteOptions._
> > > import org.apache.hudi.config.HoodieWriteConfig._
> > >
> > > class InputOutputTest extends HudiBaseTest {
> > >
> > >   val config = new SparkConf().setAppName(name)
> > >   config.set("spark.driver.allowMultipleContexts", "true")
> > >   config.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
> > >   config.setMaster("local[*]").setAppName("Local Test")
> > >   val executionContext = SparkSession.builder().config(config).getOrCreate()
> > >
> > >   val tableName = "hudi_trips_cow"
> > >   val basePath = "file:///tmp/hudi_trips_cow"
> > >   val dataGen = new DataGenerator
> > >
> > >   override def beforeAll(): Unit = {
> > >   }
> > >
> > >   test("Can create a hudi dataset") {
> > >     val inserts = convertToStringList(dataGen.generateInserts(10))
> > >     val df = executionContext.sparkSession.read.json(
> > >       executionContext.sparkContext.parallelize(inserts, 2))
> > >
> > >     df.write.format("hudi").
> > >       options(getQuickstartWriteConfigs).
> > >       option(PRECOMBINE_FIELD_OPT_KEY, "ts").
> > >       option(RECORDKEY_FIELD_OPT_KEY, "uuid").
> > >       option(PARTITIONPATH_FIELD_OPT_KEY, "partitionpath").
> > >       option(TABLE_NAME, tableName).
> > >       mode(Overwrite).
> > >       save(basePath)
> > >   }
> > > }
> > >
> > >
> > > The exception is:
> > >
> > > java.lang.NoClassDefFoundError: org/apache/spark/sql/avro/SchemaConverters$
> > >         at org.apache.hudi.AvroConversionUtils$.convertStructTypeToAvroSchema(AvroConversionUtils.scala:87)
> > >         at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:93)
> > >         at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
> > >         at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
> > >         at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
> > >         at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
> > >         at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
> > >         at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
> > >         at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
> > >         at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
> > >         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> > >         at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
> > >         at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
> > >         at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
> > >         at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
> > >         at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > >         at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > >         at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
> > >         at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
> > >         at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
> > >         at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
> > >         at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
> > >         at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
> > >         at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
> > >         at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
> > >         at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > >         at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > >         at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> > >         at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> > >         at org.scalatest.Transformer.apply(Transformer.scala:22)
> > >         at org.scalatest.Transformer.apply(Transformer.scala:20)
> > >         at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> > >         at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
> > >         at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
> > >         at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
> > >         at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > >         at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > >         at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> > >         at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
> > >         at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
> > >         at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > >         at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > >         at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
> > >         at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
> > >         at scala.collection.immutable.List.foreach(List.scala:392)
> > >         at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> > >         at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
> > >         at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> > >         at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
> > >         at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> > >         at org.scalatest.Suite$class.run(Suite.scala:1147)
> > >         at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> > >         at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > >         at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > >         at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> > >         at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
> > >         at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
> > >         at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> > >         at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> > >         at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> > >         at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> > >         at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> > >         at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> > >         at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> > >         at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> > >         at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >         at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > >         at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> > >         at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> > >         at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> > >         at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >         at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > >         at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> > >         at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> > >         at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> > >         at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> > >         at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> > >         at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> > >         at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> > >         at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> > >         at java.base/java.lang.Thread.run(Thread.java:835)
> > > Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.avro.SchemaConverters$
> > >         at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
> > >         at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
> > >         at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
> > >         ... 91 more
> > >
> > >
> > > On Wed, May 20, 2020 at 1:43 PM Balaji Varadarajan
> > > <v....@ymail.com.invalid> wrote:
> > >
> > > > Thanks for using Hudi. Looking at pom definitions between 0.5.1 and
> > > > 0.5.2, I don't see any difference that could cause this issue. As it works
> > > > with 0.5.2, I am assuming you are not blocked. Let us know otherwise.
> > > > Balaji.V
> > > >
> > > > On Wednesday, May 20, 2020, 01:17:08 PM PDT, Lian Jiang <
> > > > jiangok2006@gmail.com> wrote:
> > > >
> > > >  Thanks Vinoth.
> > > >
> > > > Below dependency has no conflict:
> > > >
> > > > compile group: 'org.apache.spark', name: 'spark-core_2.11', version:
> > > > '2.3.0'
> > > > compile group: 'org.apache.spark', name: 'spark-sql_2.11', version:
> > > '2.3.0'
> > > > compile group: 'org.scala-lang', name: 'scala-library', version:
> > > '2.11.11'
> > > > compile group: 'com.github.scopt', name: 'scopt_2.11', version:
> '3.7.1'
> > > > compile group: 'com.amazonaws', name: 'aws-java-sdk', version:
> > '1.11.297'
> > > > compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11',
> > > > version: '0.5.2-incubating'
> > > > testCompile group: 'junit', name: 'junit', version: '4.12'
> > > > testCompile group: 'org.scalatest', name: 'scalatest_2.11', version:
> > > > '3.2.0-SNAP7'
> > > > testCompile group: 'org.mockito', name: 'mockito-scala_2.11',
> version:
> > > > '1.5.12'
> > > > compile group: 'org.apache.iceberg', name: 'iceberg-api', version:
> > > > '0.8.0-incubating'
> > > >
> > > > Cheers!
> > > >
> > > >
> > > > On Wed, May 20, 2020 at 5:00 AM Vinoth Chandar <vi...@apache.org>
> > > wrote:
> > > >
> > > > > Hi Leon,
> > > > >
> > > > > Sorry for the late reply. Seems like a version mismatch for mockito..
> > > > > I see you are already trying to exclude it though.. Could you share the
> > > > > full stack trace?

Re: hudi dependency conflicts for test

Posted by Vinoth Chandar <vi...@apache.org>.
Thanks Lian! Will work it in!

On Tue, May 26, 2020 at 9:02 AM Lian Jiang <ji...@gmail.com> wrote:

> I added a comment in this wiki. Hope this works. Thanks.
>
> On Sun, May 24, 2020 at 2:32 AM Vinoth Chandar <vi...@apache.org> wrote:
>
> > Great team work everyone!
> >
> > Anything worth documenting here?
> > https://cwiki.apache.org/confluence/display/HUDI/Troubleshooting+Guide
> >
> > On Thu, May 21, 2020 at 11:02 PM Lian Jiang <ji...@gmail.com>
> wrote:
> >
> > > The root cause is that I need to use Java 8 instead of the default Java 11
> > > in IntelliJ. Thanks everyone for helping and cheers!
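> > >
> > > (For reference, one build-level way to pin the Gradle test JVM to Java 8
> > > is a gradle.properties entry; the JDK path below is a placeholder:
> > >
> > > org.gradle.java.home=/path/to/jdk8
> > >
> > > In IntelliJ the Gradle JVM can likewise be switched to a JDK 8 in the
> > > Gradle settings.)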
> > >
> > > On Thu, May 21, 2020 at 1:09 PM Lian Jiang <ji...@gmail.com>
> > wrote:
> > >
> > > > The examples in quick start work for me in spark-shell. I am trying to
> > > > use a Scala unit test to make these examples easier to repeat in CICD,
> > > > given hudi is still incubating.
> > > >
> > > > Below is the new set of dependencies as instructed:
> > > >
> > > > compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.4.5'
> > > > compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.4.5'
> > > > compile group: 'org.apache.spark', name: 'spark-avro_2.11', version: '2.4.4'
> > > > compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.11'
> > > > compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
> > > > compile group: 'com.amazonaws', name: 'aws-java-sdk', version: '1.11.297'
> > > > compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11', version: '0.5.2-incubating'
> > > > testCompile group: 'junit', name: 'junit', version: '4.12'
> > > > testCompile group: 'org.scalatest', name: 'scalatest_2.11', version: '3.2.0-SNAP7'
> > > > testCompile group: 'org.mockito', name: 'mockito-scala_2.11', version: '1.5.12'
> > > >
> > > > Yet, there is another version related exception:
> > > >
> > > > java.lang.IllegalArgumentException: Unsupported class file major version 56
> > > >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
> > > >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
> > > >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
> > > >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
> > > >       at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
> > > >       at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
> > > >       at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
> > > >       at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
> > > >       at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
> > > >       at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
> > > >       at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
> > > >       at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
> > > >       at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
> > > >       at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
> > > >       at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
> > > >       at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
> > > >       at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
> > > >       at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
> > > >       at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
> > > >       at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
> > > >       at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
> > > >       at scala.collection.immutable.List.foreach(List.scala:392)
> > > >       at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
> > > >       at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
> > > >       at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
> > > >       at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
> > > >       at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1409)
> > > >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> > > >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
> > > >       at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
> > > >       at org.apache.spark.rdd.RDD.take(RDD.scala:1382)
> > > >       at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply$mcZ$sp(RDD.scala:1517)
> > > >       at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply(RDD.scala:1517)
> > > >       at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply(RDD.scala:1517)
> > > >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> > > >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
> > > >       at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
> > > >       at org.apache.spark.rdd.RDD.isEmpty(RDD.scala:1516)
> > > >       at org.apache.spark.api.java.JavaRDDLike$class.isEmpty(JavaRDDLike.scala:544)
> > > >       at org.apache.spark.api.java.AbstractJavaRDDLike.isEmpty(JavaRDDLike.scala:45)
> > > >       at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:142)
> > > >       at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
> > > >       at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
> > > >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
> > > >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
> > > >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
> > > >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
> > > >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
> > > >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
> > > >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> > > >       at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
> > > >       at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
> > > >       at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:83)
> > > >       at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:81)
> > > >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > > >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > > >       at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:80)
> > > >       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:127)
> > > >       at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:75)
> > > >       at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
> > > >       at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
> > > >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
> > > >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
> > > >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
> > > >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > > >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > > >       at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> > > >       at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> > > >       at org.scalatest.Transformer.apply(Transformer.scala:22)
> > > >       at org.scalatest.Transformer.apply(Transformer.scala:20)
> > > >       at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> > > >       at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
> > > >       at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
> > > >       at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
> > > >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > > >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > > >       at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> > > >       at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
> > > >       at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
> > > >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > > >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > > >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
> > > >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
> > > >       at scala.collection.immutable.List.foreach(List.scala:392)
> > > >       at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> > > >       at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
> > > >       at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> > > >       at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
> > > >       at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> > > >       at org.scalatest.Suite$class.run(Suite.scala:1147)
> > > >       at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> > > >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > > >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > > >       at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> > > >       at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
> > > >       at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
> > > >       at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> > > >       at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> > > >       at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> > > >       at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> > > >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> > > >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> > > >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> > > >       at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> > > >       at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> > > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > > >       at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> > > >       at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> > > >       at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> > > >       at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> > > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > > >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> > > >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> > > >       at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> > > >       at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> > > >       at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> > > >       at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> > > >       at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> > > >       at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> > > >       at java.base/java.lang.Thread.run(Thread.java:835)
> > > >
> > > > Does anyone have a working example for unit test instead of spark-shell?
> > > > Thanks.
> > > >
> > > >
> > > >
> > > > On Thu, May 21, 2020 at 12:21 PM Lamber-Ken <la...@apache.org>
> > > wrote:
> > > >
> > > >> hello jiang,
> > > >>
> > > >> Please try the following demo; it needs Spark >= 2.4.4.
> > > >>
> > > >> ------------------------------------------------------
> > > >>
> > > >> export SPARK_HOME=/work/BigData/install/spark/spark-2.4.5-bin-hadoop2.7
> > > >> ${SPARK_HOME}/bin/spark-shell \
> > > >>     --packages org.apache.hudi:hudi-spark-bundle_2.11:0.5.2-incubating,org.apache.spark:spark-avro_2.11:2.4.4 \
> > > >>     --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer'
> > > >>
> > > >> import org.apache.spark.sql.functions._
> > > >>
> > > >> val tableName = "hudi_mor_table"
> > > >> val basePath = "file:///tmp/hudi_cow_tablen"
> > > >>
> > > >> val hudiOptions = Map[String,String](
> > > >>   "hoodie.upsert.shuffle.parallelism" -> "10",
> > > >>   "hoodie.datasource.write.recordkey.field" -> "key",
> > > >>   "hoodie.datasource.write.partitionpath.field" -> "dt",
> > > >>   "hoodie.table.name" -> tableName,
> > > >>   "hoodie.datasource.write.precombine.field" -> "timestamp"
> > > >> )
> > > >>
> > > >> val inputDF = spark.range(0, 5).
> > > >>    withColumn("key", $"id").
> > > >>    withColumn("data", lit("data")).
> > > >>    withColumn("timestamp", current_timestamp()).
> > > >>    withColumn("dt", date_format($"timestamp", "yyyy-MM-dd"))
> > > >>
> > > >> inputDF.write.format("org.apache.hudi").
> > > >>   options(hudiOptions).
> > > >>   mode("Overwrite").
> > > >>   save(basePath)
> > > >>
> > > >> spark.read.format("org.apache.hudi").load(basePath + "/*/*").show();
> > > >>
> > > >> ------------------------------------------------------
> > > >>
> > > >> Best,
> > > >> Lamber-Ken
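> > > >>
> > > >> A rough FunSuite adaptation of the demo above, for running in CICD
> > > >> instead of spark-shell. This is an untested sketch: the class and path
> > > >> names here are placeholders, it assumes hudi-spark-bundle_2.11 and
> > > >> spark-avro_2.11 on the test classpath plus a Java 8 test JVM, and the
> > > >> hudi options are the ones from the shell demo:
> > > >>
> > > >> import org.apache.spark.sql.SparkSession
> > > >> import org.apache.spark.sql.functions._
> > > >> import org.scalatest.FunSuite
> > > >>
> > > >> class HudiLocalWriteTest extends FunSuite {
> > > >>
> > > >>   test("write and read back a small hudi table") {
> > > >>     // local SparkSession with the serializer the demo sets via --conf
> > > >>     val spark = SparkSession.builder().
> > > >>       master("local[2]").
> > > >>       appName("hudi-local-test").
> > > >>       config("spark.serializer", "org.apache.spark.serializer.KryoSerializer").
> > > >>       getOrCreate()
> > > >>     import spark.implicits._
> > > >>
> > > >>     val basePath = "file:///tmp/hudi_test_table"
> > > >>     val hudiOptions = Map[String, String](
> > > >>       "hoodie.upsert.shuffle.parallelism" -> "2",
> > > >>       "hoodie.datasource.write.recordkey.field" -> "key",
> > > >>       "hoodie.datasource.write.partitionpath.field" -> "dt",
> > > >>       "hoodie.table.name" -> "hudi_test_table",
> > > >>       "hoodie.datasource.write.precombine.field" -> "timestamp"
> > > >>     )
> > > >>
> > > >>     // same synthetic input as the shell demo
> > > >>     val inputDF = spark.range(0, 5).
> > > >>       withColumn("key", $"id").
> > > >>       withColumn("data", lit("data")).
> > > >>       withColumn("timestamp", current_timestamp()).
> > > >>       withColumn("dt", date_format($"timestamp", "yyyy-MM-dd"))
> > > >>
> > > >>     inputDF.write.format("org.apache.hudi").
> > > >>       options(hudiOptions).
> > > >>       mode("Overwrite").
> > > >>       save(basePath)
> > > >>
> > > >>     val readDF = spark.read.format("org.apache.hudi").load(basePath + "/*/*")
> > > >>     assert(readDF.count() == 5)
> > > >>
> > > >>     spark.stop()
> > > >>   }
> > > >> }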
> > > >>
> > > >>
> > > >> On 2020/05/21 18:59:02, Lian Jiang <ji...@gmail.com> wrote:
> > > >> > Thanks Shiyan and Vinoth. Unfortunately, adding
> > > >> > org.apache.spark:spark-avro_2.11:2.4.4 throws another version-related
> > > >> > exception:
> > > >> >
> > > >> > java.lang.NoSuchMethodError: org.apache.avro.Schema.createUnion([Lorg/apache/avro/Schema;)Lorg/apache/avro/Schema;
> > > >> >       at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:185)
> > > >> >       at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:176)
> > > >> >       at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:174)
> > > >> >       at scala.collection.Iterator$class.foreach(Iterator.scala:891)
> > > >> >       at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
> > > >> >       at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> > > >> >       at org.apache.spark.sql.types.StructType.foreach(StructType.scala:99)
> > > >> >       at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:174)
> > > >> >       at org.apache.hudi.AvroConversionUtils$.convertStructTypeToAvroSchema(AvroConversionUtils.scala:87)
> > > >> >       at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:93)
> > > >> >       at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
> > > >> >       at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
> > > >> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
> > > >> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
> > > >> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
> > > >> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
> > > >> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
> > > >> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
> > > >> >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> > > >> >       at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
> > > >> >       at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
> > > >> >       at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
> > > >> >       at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
> > > >> >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > > >> >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > > >> >       at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
> > > >> >       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
> > > >> >       at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
> > > >> >       at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
> > > >> >       at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
> > > >> >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
> > > >> >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
> > > >> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
> > > >> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > > >> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > > >> >       at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> > > >> >       at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> > > >> >       at org.scalatest.Transformer.apply(Transformer.scala:22)
> > > >> >       at org.scalatest.Transformer.apply(Transformer.scala:20)
> > > >> >       at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> > > >> >       at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
> > > >> >       at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
> > > >> >       at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
> > > >> >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > > >> >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > > >> >       at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> > > >> >       at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
> > > >> >       at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
> > > >> >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > > >> >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > > >> >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
> > > >> >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
> > > >> >       at scala.collection.immutable.List.foreach(List.scala:392)
> > > >> >       at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> > > >> >       at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
> > > >> >       at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> > > >> >       at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
> > > >> >       at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> > > >> >       at org.scalatest.Suite$class.run(Suite.scala:1147)
> > > >> >       at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> > > >> >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > > >> >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > > >> >       at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> > > >> >       at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
> > > >> >       at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
> > > >> >       at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> > > >> >       at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> > > >> >       at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> > > >> >       at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> > > >> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> > > >> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> > > >> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> > > >> >       at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> > > >> >       at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> > > >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > >> >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > >> >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > > >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > > >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > > >> >       at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> > > >> >       at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> > > >> >       at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> > > >> >       at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> > > >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > >> >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > >> >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > > >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > > >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > > >> >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> > > >> >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> > > >> >       at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> > > >> >       at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> > > >> >       at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> > > >> >       at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> > > >> >       at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> > > >> >       at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> > > >> >       at java.base/java.lang.Thread.run(Thread.java:835)
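> > > >> >
> > > >> > (A NoSuchMethodError on Schema.createUnion usually points to an older
> > > >> > org.apache.avro:avro on the classpath than the one spark-avro expects.
> > > >> > Which dependency pulls it in can presumably be traced with
> > > >> >
> > > >> > ./gradlew dependencyInsight --dependency avro --configuration testRuntimeClasspath
> > > >> >
> > > >> > though this is untested here.)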
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> > > >> > > > >         at
> > > >> > > > >
> > > >>
> org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> > > >> > > > >         at
> > > >> > > > >
> > > >>
> > com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> > > >> > > > >         at
> > > >> org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> > > >> > > > >         at
> > > >> > > > >
> > > >>
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native
> > > >> > > > > Method)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > >> > > > >         at
> > > >> java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> > > >> > > > >         at com.sun.proxy.$Proxy2.processTestClass(Unknown
> > > Source)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> > > >> > > > >         at
> > > >> > > > >
> > > >>
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native
> > > >> > > > > Method)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > >> > > > >         at
> > > >> java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> > > >> > > > >         at java.base/java.lang.Thread.run(Thread.java:835)
> > > >> > > > > Caused by: java.lang.ClassNotFoundException:
> > > >> > > > > org.apache.spark.sql.avro.SchemaConverters$
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
> > > >> > > > >         at
> > > >> > > > >
> > > >> > > >
> > > >> > >
> > > >>
> > >
> >
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
> > > >> > > > >         at
> > > >> > > >
> java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
> > > >> > > > >         ... 91 more
> > > >> > > > >
> > > >> > > > > On Wed, May 20, 2020 at 1:43 PM Balaji Varadarajan
> > > >> > > > > <v....@ymail.com.invalid> wrote:
> > > >> > > > >
> > > >> > > > > > Thanks for using Hudi. Looking at pom definitions between 0.5.1 and
> > > >> > > > > > 0.5.2, I don't see any difference that could cause this issue. As it
> > > >> > > > > > works with 0.5.2, I am assuming you are not blocked. Let us know otherwise.
> > > >> > > > > > Balaji.V
> > > >> > > > > >
> > > >> > > > > > On Wednesday, May 20, 2020, 01:17:08 PM PDT, Lian Jiang <jiangok2006@gmail.com> wrote:
> > > >> > > > > >
> > > >> > > > > > Thanks Vinoth.
> > > >> > > > > >
> > > >> > > > > > The dependency set below has no conflict:
> > > >> > > > > >
> > > >> > > > > > compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.3.0'
> > > >> > > > > > compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.3.0'
> > > >> > > > > > compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.11'
> > > >> > > > > > compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
> > > >> > > > > > compile group: 'com.amazonaws', name: 'aws-java-sdk', version: '1.11.297'
> > > >> > > > > > compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11', version: '0.5.2-incubating'
> > > >> > > > > > testCompile group: 'junit', name: 'junit', version: '4.12'
> > > >> > > > > > testCompile group: 'org.scalatest', name: 'scalatest_2.11', version: '3.2.0-SNAP7'
> > > >> > > > > > testCompile group: 'org.mockito', name: 'mockito-scala_2.11', version: '1.5.12'
> > > >> > > > > > compile group: 'org.apache.iceberg', name: 'iceberg-api', version: '0.8.0-incubating'
> > > >> > > > > >
> > > >> > > > > > Cheers!
> > > >> > > > > >
> > > >> > > > > > On Wed, May 20, 2020 at 5:00 AM Vinoth Chandar <vinoth@apache.org> wrote:
> > > >> > > > > >
> > > >> > > > > > > Hi Leon,
> > > >> > > > > > >
> > > >> > > > > > > Sorry for the late reply. Seems like a version mismatch for mockito..
> > > >> > > > > > > I see you are already trying to exclude it though.. Could you share the
> > > >> > > > > > > full stack trace?

Re: hudi dependency conflicts for test

Posted by Lian Jiang <ji...@gmail.com>.
I added a comment in this wiki. Hope this works. Thanks.

On Sun, May 24, 2020 at 2:32 AM Vinoth Chandar <vi...@apache.org> wrote:

> Great team work everyone!
>
> Anything worth documenting here?
> https://cwiki.apache.org/confluence/display/HUDI/Troubleshooting+Guide
>
> On Thu, May 21, 2020 at 11:02 PM Lian Jiang <ji...@gmail.com> wrote:
>
> > The root cause is that I need to use Java 8 instead of the default Java 11
> > in IntelliJ. Thanks everyone for helping and cheers!
> >
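A minimal sketch of pinning a Gradle build to Java 8 for this scenario, assuming the standard Gradle java/scala plugins (the JDK path below is a placeholder, not taken from this thread):

// build.gradle -- compile to Java 8 bytecode and run the Gradle test workers
// on a Java 8 JVM, since Spark 2.4.x's closure cleaner rejects newer class
// files (the "Unsupported class file major version 56" error quoted below).
sourceCompatibility = JavaVersion.VERSION_1_8
targetCompatibility = JavaVersion.VERSION_1_8

test {
    // Placeholder path; point this at a local JDK 8 installation.
    executable = '/path/to/jdk8/bin/java'
}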
> > On Thu, May 21, 2020 at 1:09 PM Lian Jiang <ji...@gmail.com> wrote:
> >
> > > The examples in quick start work for me in spark-shell. I am trying to use
> > > scala unit tests to make these examples easier to repeat in CICD, given
> > > Hudi is still incubating.
> > >
> > > Below is the new set of dependencies as instructed:
> > >
> > > compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.4.5'
> > > compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.4.5'
> > > compile group: 'org.apache.spark', name: 'spark-avro_2.11', version: '2.4.4'
> > > compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.11'
> > > compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
> > > compile group: 'com.amazonaws', name: 'aws-java-sdk', version: '1.11.297'
> > > compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11', version: '0.5.2-incubating'
> > > testCompile group: 'junit', name: 'junit', version: '4.12'
> > > testCompile group: 'org.scalatest', name: 'scalatest_2.11', version: '3.2.0-SNAP7'
> > > testCompile group: 'org.mockito', name: 'mockito-scala_2.11', version: '1.5.12'
> > >
> > > Yet, there is another version related exception:
> > >
> > > java.lang.IllegalArgumentException: Unsupported class file major version 56
> > >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
> > >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
> > >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
> > >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
> > >       at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
> > >       at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
> > >       at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
> > >       at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
> > >       at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
> > >       at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
> > >       at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
> > >       at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
> > >       at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
> > >       at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
> > >       at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
> > >       at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
> > >       at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
> > >       at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
> > >       at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
> > >       at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
> > >       at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
> > >       at scala.collection.immutable.List.foreach(List.scala:392)
> > >       at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
> > >       at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
> > >       at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
> > >       at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
> > >       at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1409)
> > >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> > >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
> > >       at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
> > >       at org.apache.spark.rdd.RDD.take(RDD.scala:1382)
> > >       at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply$mcZ$sp(RDD.scala:1517)
> > >       at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply(RDD.scala:1517)
> > >       at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply(RDD.scala:1517)
> > >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> > >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
> > >       at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
> > >       at org.apache.spark.rdd.RDD.isEmpty(RDD.scala:1516)
> > >       at org.apache.spark.api.java.JavaRDDLike$class.isEmpty(JavaRDDLike.scala:544)
> > >       at org.apache.spark.api.java.AbstractJavaRDDLike.isEmpty(JavaRDDLike.scala:45)
> > >       at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:142)
> > >       at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
> > >       at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
> > >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
> > >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
> > >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
> > >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
> > >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
> > >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
> > >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> > >       at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
> > >       at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
> > >       at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:83)
> > >       at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:81)
> > >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > >       at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:80)
> > >       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:127)
> > >       at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:75)
> > >       at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
> > >       at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
> > >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
> > >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
> > >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
> > >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > >       at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> > >       at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> > >       at org.scalatest.Transformer.apply(Transformer.scala:22)
> > >       at org.scalatest.Transformer.apply(Transformer.scala:20)
> > >       at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> > >       at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
> > >       at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
> > >       at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
> > >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > >       at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> > >       at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
> > >       at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
> > >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
> > >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
> > >       at scala.collection.immutable.List.foreach(List.scala:392)
> > >       at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> > >       at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
> > >       at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> > >       at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
> > >       at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> > >       at org.scalatest.Suite$class.run(Suite.scala:1147)
> > >       at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> > >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > >       at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> > >       at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
> > >       at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
> > >       at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> > >       at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> > >       at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> > >       at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> > >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> > >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> > >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> > >       at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> > >       at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > >       at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> > >       at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> > >       at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> > >       at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> > >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> > >       at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> > >       at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> > >       at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> > >       at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> > >       at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> > >       at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> > >       at java.base/java.lang.Thread.run(Thread.java:835)
> > >
> > > Does anyone have a working example for unit tests instead of spark-shell? Thanks.
> > >
> > >
> > >
> > > On Thu, May 21, 2020 at 12:21 PM Lamber-Ken <la...@apache.org> wrote:
> > >
> > >> Hello Jiang,
> > >>
> > >> Please try the following demo; it needs Spark (>= 2.4.4):
> > >>
> > >> ------------------------------------------------------
> > >>
> > >> export SPARK_HOME=/work/BigData/install/spark/spark-2.4.5-bin-hadoop2.7
> > >> ${SPARK_HOME}/bin/spark-shell \
> > >>     --packages org.apache.hudi:hudi-spark-bundle_2.11:0.5.2-incubating,org.apache.spark:spark-avro_2.11:2.4.4 \
> > >>     --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer'
> > >>
> > >> import org.apache.spark.sql.functions._
> > >>
> > >> val tableName = "hudi_mor_table"
> > >> val basePath = "file:///tmp/hudi_cow_tablen"
> > >>
> > >> val hudiOptions = Map[String,String](
> > >>   "hoodie.upsert.shuffle.parallelism" -> "10",
> > >>   "hoodie.datasource.write.recordkey.field" -> "key",
> > >>   "hoodie.datasource.write.partitionpath.field" -> "dt",
> > >>   "hoodie.table.name" -> tableName,
> > >>   "hoodie.datasource.write.precombine.field" -> "timestamp"
> > >> )
> > >>
> > >> val inputDF = spark.range(0, 5).
> > >>    withColumn("key", $"id").
> > >>    withColumn("data", lit("data")).
> > >>    withColumn("timestamp", current_timestamp()).
> > >>    withColumn("dt", date_format($"timestamp", "yyyy-MM-dd"))
> > >>
> > >> inputDF.write.format("org.apache.hudi").
> > >>   options(hudiOptions).
> > >>   mode("Overwrite").
> > >>   save(basePath)
> > >>
> > >> spark.read.format("org.apache.hudi").load(basePath + "/*/*").show();
> > >>
> > >> ------------------------------------------------------
> > >>
> > >> Best,
> > >> Lamber-Ken
> > >>
> > >>
> > >> On 2020/05/21 18:59:02, Lian Jiang <ji...@gmail.com> wrote:
> > >> > Thanks Shiyan and Vinoth. Unfortunately, adding
> > >> > org.apache.spark:spark-avro_2.11:2.4.4 throws another version related exception:
> > >> >
> > >> > java.lang.NoSuchMethodError: org.apache.avro.Schema.createUnion([Lorg/apache/avro/Schema;)Lorg/apache/avro/Schema;
> > >> >       at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:185)
> > >> >       at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:176)
> > >> >       at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:174)
> > >> >       at scala.collection.Iterator$class.foreach(Iterator.scala:891)
> > >> >       at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
> > >> >       at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> > >> >       at org.apache.spark.sql.types.StructType.foreach(StructType.scala:99)
> > >> >       at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:174)
> > >> >       at org.apache.hudi.AvroConversionUtils$.convertStructTypeToAvroSchema(AvroConversionUtils.scala:87)
> > >> >       at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:93)
> > >> >       at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
> > >> >       at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
> > >> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
> > >> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
> > >> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
> > >> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
> > >> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
> > >> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
> > >> >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> > >> >       at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
> > >> >       at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
> > >> >       at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
> > >> >       at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
> > >> >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > >> >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > >> >       at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
> > >> >       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
> > >> >       at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
> > >> >       at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
> > >> >       at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
> > >> >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
> > >> >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
> > >> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
> > >> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > >> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > >> >       at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> > >> >       at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> > >> >       at org.scalatest.Transformer.apply(Transformer.scala:22)
> > >> >       at org.scalatest.Transformer.apply(Transformer.scala:20)
> > >> >       at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> > >> >       at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
> > >> >       at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
> > >> >       at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
> > >> >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > >> >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > >> >       at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> > >> >       at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
> > >> >       at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
> > >> >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > >> >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > >> >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
> > >> >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
> > >> >       at scala.collection.immutable.List.foreach(List.scala:392)
> > >> >       at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> > >> >       at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
> > >> >       at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> > >> >       at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
> > >> >       at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> > >> >       at org.scalatest.Suite$class.run(Suite.scala:1147)
> > >> >       at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> > >> >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > >> >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > >> >       at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> > >> >       at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
> > >> >       at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
> > >> >       at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> > >> >       at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> > >> >       at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> > >> >       at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> > >> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> > >> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> > >> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> > >> >       at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> > >> >       at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> > >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > >> >       at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> > >> >       at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> > >> >       at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> > >> >       at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> > >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > >> >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> > >> >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> > >> >       at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> > >> >       at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> > >> >       at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> > >> >       at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> > >> >       at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> > >> >       at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> > >> >       at java.base/java.lang.Thread.run(Thread.java:835)
> > >> >
> > >> >
> > >> > On Thu, May 21, 2020 at 10:46 AM Vinoth Chandar <vi...@apache.org> wrote:
> > >> >
> > >> > > Wow.. Race condition :) ..
> > >> > >
> > >> > > Thanks for racing, Raymond!
> > >> > >
> > >> > > On Thu, May 21, 2020 at 10:08 AM Shiyan Xu <xu.shiyan.raymond@gmail.com> wrote:
> > >> > >
> > >> > > > Hi Lian, it appears that you need to have spark-avro_2.11:2.4.4 in your classpath.
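In Gradle terms, a sketch of the matching declaration, mirroring the dependency style used elsewhere in this thread (version 2.4.4 assumes the Spark 2.4.x / Scala 2.11 build discussed here):

compile group: 'org.apache.spark', name: 'spark-avro_2.11', version: '2.4.4'

For spark-shell or spark-submit runs, passing --packages org.apache.spark:spark-avro_2.11:2.4.4 achieves the same thing, as the spark-shell demo earlier in this thread shows.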
> > >> > > >
> > >> > > >
> > >> > > >
> > >> > > > On Thu, May 21, 2020 at 10:04 AM Lian Jiang <jiangok2006@gmail.com> wrote:
> > >> > > >
> > >> > > > > Thanks Balaji.
> > >> > > > >
> > >> > > > > My unit test failed due to dependency incompatibility. Any idea will be
> > >> > > > > highly appreciated!
> > >> > > > >
> > >> > > > >
> > >> > > > > The test is copied from hudi quick start:
> > >> > > > >
> > >> > > > > import org.apache.hudi.QuickstartUtils._
> > >> > > > >
> > >> > > > > import scala.collection.JavaConversions._
> > >> > > > > import org.apache.spark.SparkConf
> > >> > > > > import org.apache.spark.sql.SparkSession
> > >> > > > > import org.apache.spark.sql.SaveMode._
> > >> > > > > import org.apache.hudi.DataSourceReadOptions._
> > >> > > > > import org.apache.hudi.DataSourceWriteOptions._
> > >> > > > > import org.apache.hudi.config.HoodieWriteConfig._
> > >> > > > >
> > >> > > > > class InputOutputTest extends HudiBaseTest {
> > >> > > > >
> > >> > > > >   // Local Spark session for the test; Kryo serializer as in the Hudi quick start.
> > >> > > > >   val config = new SparkConf()
> > >> > > > >   config.set("spark.driver.allowMultipleContexts", "true")
> > >> > > > >   config.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
> > >> > > > >   config.setMaster("local[*]").setAppName("Local Test")
> > >> > > > >   val executionContext = SparkSession.builder().config(config).getOrCreate()
> > >> > > > >
> > >> > > > > val tableName = "hudi_trips_cow"
> > >> > > > >   val basePath = "file:///tmp/hudi_trips_cow"
> > >> > > > >   val dataGen = new DataGenerator
> > >> > > > >
> > >> > > > >   override def beforeAll(): Unit = {
> > >> > > > >   }
> > >> > > > >
> > >> > > > >   test("Can create a hudi dataset") {
> > >> > > > >     val inserts = convertToStringList(dataGen.generateInserts(10))
> > >> > > > >     val df = executionContext.read.json(
> > >> > > > >       executionContext.sparkContext.parallelize(inserts, 2))
> > >> > > > >
> > >> > > > >     df.write.format("hudi").
> > >> > > > >       options(getQuickstartWriteConfigs).
> > >> > > > >       option(PRECOMBINE_FIELD_OPT_KEY, "ts").
> > >> > > > >       option(RECORDKEY_FIELD_OPT_KEY, "uuid").
> > >> > > > >       option(PARTITIONPATH_FIELD_OPT_KEY, "partitionpath").
> > >> > > > >       option(TABLE_NAME, tableName).
> > >> > > > >       mode(Overwrite).
> > >> > > > >       save(basePath)
> > >> > > > >   }
> > >> > > > > }
> > >> > > > >
> > >> > > > >
> > >> > > > > The exception is:
> > >> > > > >
> > >> > > > > java.lang.NoClassDefFoundError: org/apache/spark/sql/avro/SchemaConverters$
> > >> > > > >         at org.apache.hudi.AvroConversionUtils$.convertStructTypeToAvroSchema(AvroConversionUtils.scala:87)
> > >> > > > >         at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:93)
> > >> > > > >         at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
> > >> > > > >         at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
> > >> > > > >         at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
> > >> > > > >         at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
> > >> > > > >         at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
> > >> > > > >         at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
> > >> > > > >         at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
> > >> > > > >         at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
> > >> > > > >         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> > >> > > > >         at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
> > >> > > > >         at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
> > >> > > > >         at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
> > >> > > > >         at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
> > >> > > > >         at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > >> > > > >         at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > >> > > > >         at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
> > >> > > > >         at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
> > >> > > > >         at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
> > >> > > > >         at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
> > >> > > > >         at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
> > >> > > > >         at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
> > >> > > > >         at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
> > >> > > > >         at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
> > >> > > > >         at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > >> > > > >         at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > >> > > > >         at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> > >> > > > >         at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> > >> > > > >         at org.scalatest.Transformer.apply(Transformer.scala:22)
> > >> > > > >         at org.scalatest.Transformer.apply(Transformer.scala:20)
> > >> > > > >         at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> > >> > > > >         at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
> > >> > > > >         at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
> > >> > > > >         at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
> > >> > > > >         at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > >> > > > >         at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > >> > > > >         at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> > >> > > > >         at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
> > >> > > > >         at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
> > >> > > > >         at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > >> > > > >         at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > >> > > > >         at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
> > >> > > > >         at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
> > >> > > > >         at scala.collection.immutable.List.foreach(List.scala:392)
> > >> > > > >         at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> > >> > > > >         at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
> > >> > > > >         at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> > >> > > > >         at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
> > >> > > > >         at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> > >> > > > >         at org.scalatest.Suite$class.run(Suite.scala:1147)
> > >> > > > >         at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> > >> > > > >         at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > >> > > > >         at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > >> > > > >         at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> > >> > > > >         at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
> > >> > > > >         at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
> > >> > > > >         at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> > >> > > > >         at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> > >> > > > >         at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> > >> > > > >         at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> > >> > > > >         at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> > >> > > > >         at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> > >> > > > >         at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> > >> > > > >         at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> > >> > > > >         at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> > >> > > > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > > >         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > > >         at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > >> > > > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > >> > > > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > >> > > > >         at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> > >> > > > >         at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> > >> > > > >         at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> > >> > > > >         at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> > >> > > > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > > >         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > > >         at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > >> > > > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > >> > > > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > >> > > > >         at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> > >> > > > >         at
> > >> > > > >
> > >> > > >
> > >> > >
> > >>
> >
> org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> > >> > > > >         at
> > >> > > > >
> > >> > > >
> > >> > >
> > >>
> >
> org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> > >> > > > >         at
> > >> > > > >
> > >> > > >
> > >> > >
> > >>
> >
> org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> > >> > > > >         at
> > >> > > > >
> > >> > > >
> > >> > >
> > >>
> >
> org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> > >> > > > >         at
> > >> > > > >
> > >> > > >
> > >> > >
> > >>
> >
> java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> > >> > > > >         at
> > >> > > > >
> > >> > > >
> > >> > >
> > >>
> >
> java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> > >> > > > >         at
> > >> > > > >
> > >> > > >
> > >> > >
> > >>
> >
> org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> > >> > > > >         at java.base/java.lang.Thread.run(Thread.java:835)
> > >> > > > > Caused by: java.lang.ClassNotFoundException:
> > >> > > > > org.apache.spark.sql.avro.SchemaConverters$
> > >> > > > >         at
> > >> > > > >
> > >> > > >
> > >> > >
> > >>
> >
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
> > >> > > > >         at
> > >> > > > >
> > >> > > >
> > >> > >
> > >>
> >
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
> > >> > > > >         at
> > >> > > > java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
> > >> > > > >         ... 91 more
> > >> > > > >
> > >> > > > >

Re: hudi dependency conflicts for test

Posted by Vinoth Chandar <vi...@apache.org>.
Great team work everyone!

Anything worth documenting here?
https://cwiki.apache.org/confluence/display/HUDI/Troubleshooting+Guide

On Thu, May 21, 2020 at 11:02 PM Lian Jiang <ji...@gmail.com> wrote:

> The root cause is that I need to use Java 8 instead of the default Java 11
> in IntelliJ. Thanks everyone for helping and cheers!
>
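> For reference, pinning the Gradle build and its test workers to JDK 8 looks
> roughly like the sketch below (assuming the standard java/scala plugins;
> JAVA8_HOME is only an illustrative variable pointing at a JDK 8 install, not
> something Gradle defines):
>
> // build.gradle -- emit Java 8 bytecode
> sourceCompatibility = 1.8
> targetCompatibility = 1.8
>
> // fork the test JVMs with a JDK 8 java instead of the IDE/daemon default
> tasks.withType(Test) {
>     executable = "${System.getenv('JAVA8_HOME')}/bin/java"
> }
>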
> On Thu, May 21, 2020 at 1:09 PM Lian Jiang <ji...@gmail.com> wrote:
>
> > The examples in the quick start work for me in spark-shell. I am trying to
> > use a scala unit test to make these examples easier to repeat in CI/CD,
> > given hudi is still incubating.
> >
> > Below is the new set of dependencies as instructed:
> >
> > compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.4.5'
> > compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.4.5'
> > compile group: 'org.apache.spark', name: 'spark-avro_2.11', version: '2.4.4'
> > compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.11'
> > compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
> > compile group: 'com.amazonaws', name: 'aws-java-sdk', version: '1.11.297'
> > compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11', version: '0.5.2-incubating'
> > testCompile group: 'junit', name: 'junit', version: '4.12'
> > testCompile group: 'org.scalatest', name: 'scalatest_2.11', version: '3.2.0-SNAP7'
> > testCompile group: 'org.mockito', name: 'mockito-scala_2.11', version: '1.5.12'
> >
> > Yet, there is another version related exception:
> >
> > java.lang.IllegalArgumentException: Unsupported class file major version 56
> >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
> >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
> >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
> >       at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
> >       at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
> >       at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
> >       at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
> >       at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
> >       at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
> >       at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
> >       at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
> >       at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
> >       at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
> >       at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
> >       at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
> >       at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
> >       at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
> >       at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
> >       at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
> >       at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
> >       at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
> >       at scala.collection.immutable.List.foreach(List.scala:392)
> >       at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
> >       at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
> >       at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
> >       at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
> >       at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1409)
> >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
> >       at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
> >       at org.apache.spark.rdd.RDD.take(RDD.scala:1382)
> >       at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply$mcZ$sp(RDD.scala:1517)
> >       at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply(RDD.scala:1517)
> >       at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply(RDD.scala:1517)
> >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
> >       at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
> >       at org.apache.spark.rdd.RDD.isEmpty(RDD.scala:1516)
> >       at org.apache.spark.api.java.JavaRDDLike$class.isEmpty(JavaRDDLike.scala:544)
> >       at org.apache.spark.api.java.AbstractJavaRDDLike.isEmpty(JavaRDDLike.scala:45)
> >       at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:142)
> >       at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
> >       at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
> >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> >       at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
> >       at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
> >       at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:83)
> >       at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:81)
> >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> >       at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:80)
> >       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:127)
> >       at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:75)
> >       at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
> >       at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
> >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
> >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> >       at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> >       at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> >       at org.scalatest.Transformer.apply(Transformer.scala:22)
> >       at org.scalatest.Transformer.apply(Transformer.scala:20)
> >       at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> >       at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
> >       at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
> >       at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
> >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> >       at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> >       at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
> >       at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
> >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
> >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
> >       at scala.collection.immutable.List.foreach(List.scala:392)
> >       at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> >       at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
> >       at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> >       at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
> >       at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> >       at org.scalatest.Suite$class.run(Suite.scala:1147)
> >       at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> >       at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> >       at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
> >       at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
> >       at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> >       at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> >       at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> >       at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> >       at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> >       at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> >       at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> >       at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> >       at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> >       at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> >       at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> >       at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> >       at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> >       at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> >       at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> >       at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> >       at java.base/java.lang.Thread.run(Thread.java:835)
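> >
> > Side note: "major version 56" is the class-file format of a JDK newer than
> > what Spark 2.4's bundled ASM (xbean-asm6) can parse. A quick check of which
> > JVM the Gradle test worker actually forks, runnable inside any test:
> >
> >     println("test JVM: " + System.getProperty("java.version"))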
> >
> > Does anyone have a working example for unit test instead of spark-shell?
> Thanks.
> >
> >
> >
> > On Thu, May 21, 2020 at 12:21 PM Lamber-Ken <la...@apache.org>
> wrote:
> >
> >> Hello Jiang,
> >>
> >> Please try the following demo; it needs spark (>= 2.4.4).
> >>
> >> ------------------------------------------------------
> >>
> >> export SPARK_HOME=/work/BigData/install/spark/spark-2.4.5-bin-hadoop2.7
> >> ${SPARK_HOME}/bin/spark-shell \
> >>     --packages org.apache.hudi:hudi-spark-bundle_2.11:0.5.2-incubating,org.apache.spark:spark-avro_2.11:2.4.4 \
> >>     --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer'
> >>
> >> import org.apache.spark.sql.functions._
> >>
> >> val tableName = "hudi_mor_table"
> >> val basePath = "file:///tmp/hudi_cow_table"
> >>
> >> val hudiOptions = Map[String,String](
> >>   "hoodie.upsert.shuffle.parallelism" -> "10",
> >>   "hoodie.datasource.write.recordkey.field" -> "key",
> >>   "hoodie.datasource.write.partitionpath.field" -> "dt",
> >>   "hoodie.table.name" -> tableName,
> >>   "hoodie.datasource.write.precombine.field" -> "timestamp"
> >> )
> >>
> >> val inputDF = spark.range(0, 5).
> >>    withColumn("key", $"id").
> >>    withColumn("data", lit("data")).
> >>    withColumn("timestamp", current_timestamp()).
> >>    withColumn("dt", date_format($"timestamp", "yyyy-MM-dd"))
> >>
> >> inputDF.write.format("org.apache.hudi").
> >>   options(hudiOptions).
> >>   mode("Overwrite").
> >>   save(basePath)
> >>
> >> spark.read.format("org.apache.hudi").load(basePath + "/*/*").show();
> >>
> >> ------------------------------------------------------
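> >>
> >> And if you want it as a repeatable unit test rather than a spark-shell
> >> session, roughly the same write/read as a scalatest suite (a sketch only,
> >> not verified here; assumes spark 2.4.x, the hudi bundle and spark-avro on
> >> the test classpath, and a Java 8 JVM; names like HudiLocalWriteTest and
> >> the /tmp path are just illustrative):
> >>
> >> import org.apache.spark.sql.SparkSession
> >> import org.apache.spark.sql.functions._
> >> import org.scalatest.FunSuite
> >>
> >> class HudiLocalWriteTest extends FunSuite {
> >>
> >>   test("write and read back a small hudi table") {
> >>     // Local session mirroring the spark-shell flags above.
> >>     val spark = SparkSession.builder()
> >>       .master("local[*]")
> >>       .appName("hudi-local-test")
> >>       .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
> >>       .getOrCreate()
> >>     import spark.implicits._
> >>
> >>     val basePath = "file:///tmp/hudi_unit_test"
> >>
> >>     // Same shape of data as the shell demo above.
> >>     val input = spark.range(0, 5)
> >>       .withColumn("key", $"id")
> >>       .withColumn("timestamp", current_timestamp())
> >>       .withColumn("dt", date_format($"timestamp", "yyyy-MM-dd"))
> >>
> >>     input.write.format("org.apache.hudi")
> >>       .option("hoodie.table.name", "hudi_unit_test")
> >>       .option("hoodie.datasource.write.recordkey.field", "key")
> >>       .option("hoodie.datasource.write.partitionpath.field", "dt")
> >>       .option("hoodie.datasource.write.precombine.field", "timestamp")
> >>       .option("hoodie.upsert.shuffle.parallelism", "2")
> >>       .mode("overwrite")
> >>       .save(basePath)
> >>
> >>     // Read back and verify the row count survived the round trip.
> >>     val readBack = spark.read.format("org.apache.hudi").load(basePath + "/*/*")
> >>     assert(readBack.count() == 5)
> >>   }
> >> }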
> >>
> >> Best,
> >> Lamber-Ken
> >>
> >>
> >> On 2020/05/21 18:59:02, Lian Jiang <ji...@gmail.com> wrote:
> >> > Thanks Shiyan and Vinoth. Unfortunately, adding
> >> > org.apache.spark:spark-avro_2.11:2.4.4 throws another version related
> >> > exception:
> >> >
> >> > java.lang.NoSuchMethodError: org.apache.avro.Schema.createUnion([Lorg/apache/avro/Schema;)Lorg/apache/avro/Schema;
> >> >       at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:185)
> >> >       at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:176)
> >> >       at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:174)
> >> >       at scala.collection.Iterator$class.foreach(Iterator.scala:891)
> >> >       at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
> >> >       at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> >> >       at org.apache.spark.sql.types.StructType.foreach(StructType.scala:99)
> >> >       at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:174)
> >> >       at org.apache.hudi.AvroConversionUtils$.convertStructTypeToAvroSchema(AvroConversionUtils.scala:87)
> >> >       at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:93)
> >> >       at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
> >> >       at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
> >> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
> >> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
> >> >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
> >> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
> >> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
> >> >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
> >> >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> >> >       at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
> >> >       at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
> >> >       at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
> >> >       at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
> >> >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> >> >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> >> >       at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
> >> >       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
> >> >       at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
> >> >       at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
> >> >       at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
> >> >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
> >> >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
> >> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
> >> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> >> >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> >> >       at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> >> >       at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> >> >       at org.scalatest.Transformer.apply(Transformer.scala:22)
> >> >       at org.scalatest.Transformer.apply(Transformer.scala:20)
> >> >       at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> >> >       at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
> >> >       at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
> >> >       at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
> >> >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> >> >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> >> >       at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> >> >       at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
> >> >       at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
> >> >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> >> >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> >> >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
> >> >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
> >> >       at scala.collection.immutable.List.foreach(List.scala:392)
> >> >       at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> >> >       at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
> >> >       at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> >> >       at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
> >> >       at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> >> >       at org.scalatest.Suite$class.run(Suite.scala:1147)
> >> >       at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> >> >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> >> >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> >> >       at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> >> >       at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
> >> >       at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
> >> >       at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> >> >       at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> >> >       at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> >> >       at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> >> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> >> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> >> >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> >> >       at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> >> >       at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >> >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> >> >       at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> >> >       at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> >> >       at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> >> >       at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >> >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> >> >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> >> >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> >> >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> >> >       at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> >> >       at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> >> >       at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> >> >       at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> >> >       at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> >> >       at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> >> >       at java.base/java.lang.Thread.run(Thread.java:835)
> >> >
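> >> > Worth checking whether an older avro from the hadoop transitive tree
> >> > (avro 1.7.x) is shadowing the 1.8.x that spark-avro 2.4.x expects: the
> >> > varargs Schema.createUnion in the error message is, as far as I can
> >> > tell, only available from avro 1.8 onward. A gradle sketch to force it
> >> > (not verified here):
> >> >
> >> > configurations.all {
> >> >     resolutionStrategy {
> >> >         // pin avro so hadoop's 1.7.x cannot win dependency resolution
> >> >         force 'org.apache.avro:avro:1.8.2'
> >> >     }
> >> > }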
> >> >
> >> > On Thu, May 21, 2020 at 10:46 AM Vinoth Chandar <vi...@apache.org>
> >> wrote:
> >> >
> >> > > Wow.. Race condition :) ..
> >> > >
> >> > > Thanks for racing , Raymond!
> >> > >
> >> > > On Thu, May 21, 2020 at 10:08 AM Shiyan Xu <
> >> xu.shiyan.raymond@gmail.com>
> >> > > wrote:
> >> > >
> >> > > > Hi Lian, it appears that you need to have spark-avro_2.11:2.4.4 in
> >> > > > your classpath.
> >> > > >
> >> > > >
> >> > > >
> >> > > > On Thu, May 21, 2020 at 10:04 AM Lian Jiang <
> jiangok2006@gmail.com>
> >> > > wrote:
> >> > > >
> >> > > > > Thanks Balaji.
> >> > > > >
> >> > > > > My unit test failed due to dependency incompatibility. Any idea
> >> > > > > will be highly appreciated!
> >> > > > >
> >> > > > >
> >> > > > > The test is copied from hudi quick start:
> >> > > > >
> >> > > > > import org.apache.hudi.QuickstartUtils._
> >> > > > >
> >> > > > > import scala.collection.JavaConversions._
> >> > > > > import org.apache.spark.sql.SaveMode._
> >> > > > > import org.apache.hudi.DataSourceReadOptions._
> >> > > > > import org.apache.hudi.DataSourceWriteOptions._
> >> > > > > import org.apache.hudi.config.HoodieWriteConfig._
> >> > > > > import org.apache.spark.SparkConf
> >> > > > > import org.apache.spark.sql.SparkSession
> >> > > > >
> >> > > > > class InputOutputTest extends HudiBaseTest {
> >> > > > >
> >> > > > >   // Local Spark session for the test; Kryo is required by hudi.
> >> > > > >   val config = new SparkConf()
> >> > > > >     .setMaster("local[*]")
> >> > > > >     .setAppName("Local Test")
> >> > > > >     .set("spark.driver.allowMultipleContexts", "true")
> >> > > > >     .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
> >> > > > >   val spark = SparkSession.builder().config(config).getOrCreate()
> >> > > > >
> >> > > > >   val tableName = "hudi_trips_cow"
> >> > > > >   val basePath = "file:///tmp/hudi_trips_cow"
> >> > > > >   val dataGen = new DataGenerator
> >> > > > >
> >> > > > >   override def beforeAll(): Unit = {
> >> > > > >   }
> >> > > > >
> >> > > > >   test("Can create a hudi dataset") {
> >> > > > >     val inserts = convertToStringList(dataGen.generateInserts(10))
> >> > > > >     val df = spark.read.json(spark.sparkContext.parallelize(inserts, 2))
> >> > > > >
> >> > > > >     df.write.format("hudi").
> >> > > > >       options(getQuickstartWriteConfigs).
> >> > > > >       option(PRECOMBINE_FIELD_OPT_KEY, "ts").
> >> > > > >       option(RECORDKEY_FIELD_OPT_KEY, "uuid").
> >> > > > >       option(PARTITIONPATH_FIELD_OPT_KEY, "partitionpath").
> >> > > > >       option(TABLE_NAME, tableName).
> >> > > > >       mode(Overwrite).
> >> > > > >       save(basePath)
> >> > > > >   }
> >> > > > > }
> >> > > > >
> >> > > > >
> >> > > > > The exception is:
> >> > > > >
> >> > > > > java.lang.NoClassDefFoundError: org/apache/spark/sql/avro/SchemaConverters$
> >> > > > >       at org.apache.hudi.AvroConversionUtils$.convertStructTypeToAvroSchema(AvroConversionUtils.scala:87)
> >> > > > >       at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:93)
> >> > > > >       at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
> >> > > > >       at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
> >> > > > >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
> >> > > > >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
> >> > > > >       at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
> >> > > > >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
> >> > > > >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
> >> > > > >       at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
> >> > > > >       at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> >> > > > >       at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
> >> > > > >       at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
> >> > > > >       at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
> >> > > > >       at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
> >> > > > >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> >> > > > >       at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> >> > > > >       at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
> >> > > > >       at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
> >> > > > >       at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
> >> > > > >       at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
> >> > > > >       at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
> >> > > > >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
> >> > > > >       at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
> >> > > > >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
> >> > > > >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> >> > > > >       at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> >> > > > >       at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> >> > > > >       at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> >> > > > >       at org.scalatest.Transformer.apply(Transformer.scala:22)
> >> > > > >       at org.scalatest.Transformer.apply(Transformer.scala:20)
> >> > > > >       at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> >> > > > >       at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
> >> > > > >       at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
> >> > > > >       at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
> >> > > > >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> >> > > > >       at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> >> > > > >       at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> >> > > > >       at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
> >> > > > >       at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
> >> > > > >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> >> > > > >       at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> >> > > > >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
> >> > > > >       at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
> >> > > > >       at scala.collection.immutable.List.foreach(List.scala:392)
> >> > > > >       at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> >> > > > >       at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
> >> > > > >       at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> >> > > > >       at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
> >> > > > >       at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> >> > > > >       at org.scalatest.Suite$class.run(Suite.scala:1147)
> >> > > > >       at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> >> > > > >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> >> > > > >       at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> >> > > > >       at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> >> > > > >       at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
> >> > > > >       at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
> >> > > > >       at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> >> > > > >       at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> >> > > > >       at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> >> > > > >       at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> >> > > > >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> >> > > > >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> >> > > > >       at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> >> > > > >       at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> >> > > > >       at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> >> > > > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> > > > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >> > > > >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> > > > >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> >> > > > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> >> > > > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> >> > > > >       at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> >> > > > >       at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> >> > > > >       at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> >> > > > >       at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> >> > > > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> > > > >       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >> > > > >       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> > > > >       at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> >> > > > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> >> > > > >       at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> >> > > > >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> >> > > > >       at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> >> > > > >       at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> >> > > > >       at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> >> > > > >       at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> >> > > > >       at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> >> > > > >       at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> >> > > > >       at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> >> > > > >       at java.base/java.lang.Thread.run(Thread.java:835)
> >> > > > > Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.avro.SchemaConverters$
> >> > > > >       at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
> >> > > > >       at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
> >> > > > >       at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
> >> > > > >       ... 91 more
> >> > > > >
> >> > > > >
> >> > > > > On Wed, May 20, 2020 at 1:43 PM Balaji Varadarajan
> >> > > > > <v....@ymail.com.invalid> wrote:
> >> > > > >
> >> > > > > >  Thanks for using Hudi. Looking at pom definitions between
> >> 0.5.1 and
> >> > > > > > 0.5.2, I don't see any difference that could cause this issue.
> >> As it
> >> > > > > works
> >> > > > > > with 0.5.2, I am assuming you are not blocked. Let us know
> >> otherwise.
> >> > > > > > Balaji.V    On Wednesday, May 20, 2020, 01:17:08 PM PDT, Lian
> >> Jiang <
> >> > > > > > jiangok2006@gmail.com> wrote:
> >> > > > > >
> >> > > > > >  Thanks Vinoth.
> >> > > > > >
> >> > > > > > Below dependency has no conflict:
> >> > > > > >
> >> > > > > > compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.3.0'
> >> > > > > > compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.3.0'
> >> > > > > > compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.11'
> >> > > > > > compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
> >> > > > > > compile group: 'com.amazonaws', name: 'aws-java-sdk', version: '1.11.297'
> >> > > > > > compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11', version: '0.5.2-incubating'
> >> > > > > > testCompile group: 'junit', name: 'junit', version: '4.12'
> >> > > > > > testCompile group: 'org.scalatest', name: 'scalatest_2.11', version: '3.2.0-SNAP7'
> >> > > > > > testCompile group: 'org.mockito', name: 'mockito-scala_2.11', version: '1.5.12'
> >> > > > > > compile group: 'org.apache.iceberg', name: 'iceberg-api', version: '0.8.0-incubating'
> >> > > > > >
> >> > > > > > Cheers!
> >> > > > > >
> >> > > > > >
> >> > > > > > On Wed, May 20, 2020 at 5:00 AM Vinoth Chandar <
> >> vinoth@apache.org>
> >> > > > > wrote:
> >> > > > > >
> >> > > > > > > Hi Leon,
> >> > > > > > >
> >> > > > > > > Sorry for the late reply.  Seems like a version mismatch for
> >> > > > mockito..
> >> > > > > > > I see you are already trying to exclude it though.. Could
> you
> >> share
> >> > > > the
> >> > > > > > > full stack trace?
> >> > > > > > >
> >> > > > > > >
> >> > > > > > >
> >> > > > > > >
> >> > > > > > > On Mon, May 18, 2020 at 1:12 PM Lian Jiang <
> >> jiangok2006@gmail.com>
> >> > > > > > wrote:
> >> > > > > > >
> >> > > > > > > > Hi,
> >> > > > > > > >
> >> > > > > > > > I am using hudi in a scala gradle project:
> >> > > > > > > >
> >> > > > > > > > dependencies {
> >> > > > > > > >    compile group: 'org.apache.spark', name:
> >> 'spark-core_2.11',
> >> > > > > version:
> >> > > > > > > > '2.4.4'
> >> > > > > > > >    compile group: 'org.apache.spark', name:
> >> 'spark-sql_2.11',
> >> > > > > version:
> >> > > > > > > > '2.4.4'
> >> > > > > > > >    compile group: 'org.scala-lang', name: 'scala-library',
> >> > > version:
> >> > > > > > > > '2.11.11'
> >> > > > > > > >    compile group: 'com.github.scopt', name: 'scopt_2.11',
> >> > > version:
> >> > > > > > > '3.7.1'
> >> > > > > > > >    compile group: 'org.apache.spark', name:
> >> 'spark-avro_2.11',
> >> > > > > version:
> >> > > > > > > > '2.4.4'
> >> > > > > > > >    compile group: 'com.amazonaws', name: 'aws-java-sdk',
> >> version:
> >> > > > > > > > '1.11.297'
> >> > > > > > > >    compile group: 'com.zillow.datacontracts', name:
> >> > > > > > > > 'contract-evaluation-library', version:
> >> '0.1.0.master.98a438b'
> >> > > > > > > >    compile (group: 'org.apache.hudi', name:
> >> 'hudi-spark_2.11',
> >> > > > > > > > version: '0.5.1-incubating') {
> >> > > > > > > >        exclude group: 'org.scala-lang', module:
> >> 'scala-library'
> >> > > > > > > >        exclude group: 'org.scalatest', module:
> >> 'scalatest_2.12'
> >> > > > > > > >    }
> >> > > > > > > >
> >> > > > > > > >    testCompile group: 'junit', name: 'junit', version:
> >> '4.12'
> >> > > > > > > >    testCompile group: 'org.scalatest', name:
> >> 'scalatest_2.11',
> >> > > > > > > > version: '3.2.0-SNAP7'
> >> > > > > > > >    testCompile group: 'org.mockito', name:
> >> 'mockito-scala_2.11',
> >> > > > > > > > version: '1.5.12'
> >> > > > > > > > }
> >> > > > > > > >
> >> > > > > > > > Below code throws exception '
> >> > > > > > > > java.lang.NoSuchMethodError:
> >> > > > > > > >
> >> > > > > > > >
> >> > > > > > >
> >> > > > > >
> >> > > > >
> >> > > >
> >> > >
> >>
> org.scalatest.mockito.MockitoSugar.$init$(Lorg/scalatest/mockito/MockitoSugar;)V'
> >> > > > > > > >
> >> > > > > > > > import org.junit.runner.RunWith
> >> > > > > > > > import org.scalatest.FunSuite
> >> > > > > > > > import org.scalatest.junit.JUnitRunner
> >> > > > > > > > import org.scalatest.mockito.MockitoSugar
> >> > > > > > > >
> >> > > > > > > > @RunWith(classOf[JUnitRunner])
> >> > > > > > > > class BaseTest extends FunSuite with MockitoSugar {
> >> > > > > > > > }
> >> > > > > > > >
> >> > > > > > > > Removing org.apache.hudi from the dependency list will
> make
> >> the
> >> > > > code
> >> > > > > > > > work. Does anybody know how to include hudi dependency
> >> without
> >> > > > > > > > conflicting with the test?
> >> > > > > > > >
> >> > > > > > > > Appreciate any help!
> >> > > > > > > >
> >> > > > > > > > Regards
> >> > > > > > > >
> >> > > > > > > > Leon
> >> > > > > > > >
> >> > > > > > >
> >> > > > > >
> >> > > > > >
> >> > > > > > --
> >> > > > > >
> >> > > > > > Create your own email signature
> >> > > > > > <
> >> > > > > >
> >> > > > >
> >> > > >
> >> > >
> >>
> https://www.wisestamp.com/signature-in-email/?utm_source=promotion&utm_medium=signature&utm_campaign=create_your_own&srcid=5234462839406592
> >> > > > > > >
> >> > > > > >
> >> > > > >
> >> > > > >
> >> > > > >
> >> > > > > --
> >> > > > >
> >> > > > > Create your own email signature
> >> > > > > <
> >> > > > >
> >> > > >
> >> > >
> >>
> https://www.wisestamp.com/signature-in-email/?utm_source=promotion&utm_medium=signature&utm_campaign=create_your_own&srcid=5234462839406592
> >> > > > > >
> >> > > > >
> >> > > >
> >> > >
> >> >
> >> >
> >> > --
> >> >
> >> > Create your own email signature
> >> > <
> >>
> https://www.wisestamp.com/signature-in-email/?utm_source=promotion&utm_medium=signature&utm_campaign=create_your_own&srcid=5234462839406592
> >> >
> >> >
> >>
> >
> >
> > --
> >
> > Create your own email signature
> > <
> https://www.wisestamp.com/signature-in-email/?utm_source=promotion&utm_medium=signature&utm_campaign=create_your_own&srcid=5234462839406592
> >
> >
>
>
> --
>
> Create your own email signature
> <
> https://www.wisestamp.com/signature-in-email/?utm_source=promotion&utm_medium=signature&utm_campaign=create_your_own&srcid=5234462839406592
> >
>

Re: hudi dependency conflicts for test

Posted by Lian Jiang <ji...@gmail.com>.
The root cause is that I need to use Java 8 instead of the default Java 11
in IntelliJ. Thanks everyone for helping, and cheers!
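
For anyone who lands on this thread later: "class file major version 56" is
Java 12 bytecode (52 is Java 8, 55 is Java 11), and Spark 2.4.x only runs on
Java 8, so both the compiler output and the test JVM need to be Java 8. A
minimal Gradle sketch of pinning the build -- the JDK path is a placeholder
to replace with a local JDK 8 install:

// build.gradle: compile to Java 8 bytecode (class file major version 52)
sourceCompatibility = JavaVersion.VERSION_1_8
targetCompatibility = JavaVersion.VERSION_1_8

// gradle.properties: run the Gradle daemon and forked test JVMs on JDK 8
// (placeholder path; point it at your own JDK 8 home)
org.gradle.java.home=/path/to/jdk8

In IntelliJ, the Project SDK (File > Project Structure > Project) and the
Gradle JVM (Settings > Build Tools > Gradle) should point at the same JDK 8.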


Re: hudi dependency conflicts for test

Posted by Lian Jiang <ji...@gmail.com>.
The examples in the quick start work for me in spark-shell. I am trying to use
Scala unit tests to make these examples easier to repeat in CI/CD, given Hudi
is still incubating.

Below is the new set of dependencies as instructed:

compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.4.5'
compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.4.5'
compile group: 'org.apache.spark', name: 'spark-avro_2.11', version: '2.4.4'
compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.11'
compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
compile group: 'com.amazonaws', name: 'aws-java-sdk', version: '1.11.297'
compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11', version: '0.5.2-incubating'
testCompile group: 'junit', name: 'junit', version: '4.12'
testCompile group: 'org.scalatest', name: 'scalatest_2.11', version: '3.2.0-SNAP7'
testCompile group: 'org.mockito', name: 'mockito-scala_2.11', version: '1.5.12'
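
As an aside (see the root-cause message above): if the whole build cannot be
moved to JDK 8, Gradle can also fork just the unit-test JVM from a specific
JDK. A sketch, with a placeholder path to substitute for a local JDK 8 install:

// build.gradle: fork only the test JVM from JDK 8, leaving the rest of the
// build untouched; 'executable' comes from Gradle's JavaForkOptions
test {
    executable = '/path/to/jdk8/bin/java'
}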

Yet, there is another version-related exception:

java.lang.IllegalArgumentException: Unsupported class file major version 56
	at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
	at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
	at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
	at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
	at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
	at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
	at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
	at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
	at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
	at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
	at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
	at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
	at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
	at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
	at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
	at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
	at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
	at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
	at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
	at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
	at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
	at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1409)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
	at org.apache.spark.rdd.RDD.take(RDD.scala:1382)
	at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply$mcZ$sp(RDD.scala:1517)
	at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply(RDD.scala:1517)
	at org.apache.spark.rdd.RDD$$anonfun$isEmpty$1.apply(RDD.scala:1517)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
	at org.apache.spark.rdd.RDD.isEmpty(RDD.scala:1516)
	at org.apache.spark.api.java.JavaRDDLike$class.isEmpty(JavaRDDLike.scala:544)
	at org.apache.spark.api.java.AbstractJavaRDDLike.isEmpty(JavaRDDLike.scala:45)
	at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:142)
	at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
	at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:83)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:81)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:80)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:127)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:75)
	at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
	at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
	at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
	at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
	at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
	at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
	at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
	at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
	at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
	at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
	at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
	at org.scalatest.Suite$class.run(Suite.scala:1147)
	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
	at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
	at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
	at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
	at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
	at java.base/java.lang.Thread.run(Thread.java:835)

Does anyone have a working example for a unit test instead of spark-shell? Thanks.
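
For anyone following along, here is the direction I plan to try next; treat
it as a sketch, not a verified fix. My guess is that an older Avro pulled in
transitively (e.g. by Hadoop) shadows the Avro 1.8.x that
spark-avro_2.11:2.4.4 compiles against, which would also explain the earlier
Schema.createUnion NoSuchMethodError. In Gradle, building on the dependency
set discussed above in this thread:

dependencies {
    compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.4.4'
    compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.4.4'
    compile group: 'org.apache.spark', name: 'spark-avro_2.11', version: '2.4.4'
    compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11', version: '0.5.2-incubating'

    testCompile group: 'junit', name: 'junit', version: '4.12'
    testCompile group: 'org.scalatest', name: 'scalatest_2.11', version: '3.2.0-SNAP7'
}

configurations.all {
    // Unverified assumption: force the Avro that spark-avro 2.4.4 needs,
    // so that older transitive copies do not win dependency resolution.
    resolutionStrategy.force 'org.apache.avro:avro:1.8.2'
}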






Re: hudi dependency conflicts for test

Posted by Lamber-Ken <la...@apache.org>.
Hello Jiang,

Please try the following demo; it needs Spark >= 2.4.4.

------------------------------------------------------

export SPARK_HOME=/work/BigData/install/spark/spark-2.4.5-bin-hadoop2.7
${SPARK_HOME}/bin/spark-shell \
    --packages org.apache.hudi:hudi-spark-bundle_2.11:0.5.2-incubating,org.apache.spark:spark-avro_2.11:2.4.4 \
    --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer'

import org.apache.spark.sql.functions._

val tableName = "hudi_mor_table"
val basePath = "file:///tmp/hudi_mor_table"

val hudiOptions = Map[String,String](
  "hoodie.upsert.shuffle.parallelism" -> "10",
  "hoodie.datasource.write.recordkey.field" -> "key",
  "hoodie.datasource.write.partitionpath.field" -> "dt", 
  "hoodie.table.name" -> tableName,
  "hoodie.datasource.write.precombine.field" -> "timestamp"
)

val inputDF = spark.range(0, 5).
   withColumn("key", $"id").
   withColumn("data", lit("data")).
   withColumn("timestamp", current_timestamp()).
   withColumn("dt", date_format($"timestamp", "yyyy-MM-dd"))

inputDF.write.format("org.apache.hudi").
  options(hudiOptions).
  mode("Overwrite").
  save(basePath)

spark.read.format("org.apache.hudi").load(basePath + "/*/*").show();

------------------------------------------------------
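
If you need the same flow as a self-contained unit test instead of
spark-shell, it should translate roughly as below. This is only a sketch I
have not run: it assumes hudi-spark-bundle_2.11:0.5.2-incubating,
spark-avro_2.11:2.4.4 and scalatest_2.11 are on the test classpath, and the
class and table names are made up for illustration.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.scalatest.FunSuite

class HudiLocalWriteTest extends FunSuite {

  test("write and read back a small hudi table") {
    // Local session; hudi's write path needs the Kryo serializer.
    val spark = SparkSession.builder().
      master("local[*]").
      appName("hudi-local-test").
      config("spark.serializer", "org.apache.spark.serializer.KryoSerializer").
      getOrCreate()
    import spark.implicits._

    val tableName = "hudi_test_table"
    val basePath = "file:///tmp/" + tableName

    val hudiOptions = Map(
      "hoodie.upsert.shuffle.parallelism" -> "2",
      "hoodie.datasource.write.recordkey.field" -> "key",
      "hoodie.datasource.write.partitionpath.field" -> "dt",
      "hoodie.table.name" -> tableName,
      "hoodie.datasource.write.precombine.field" -> "timestamp")

    // Same synthetic input as the spark-shell demo above.
    val inputDF = spark.range(0, 5).
      withColumn("key", $"id").
      withColumn("data", lit("data")).
      withColumn("timestamp", current_timestamp()).
      withColumn("dt", date_format($"timestamp", "yyyy-MM-dd"))

    inputDF.write.format("org.apache.hudi").
      options(hudiOptions).
      mode("Overwrite").
      save(basePath)

    val readDF = spark.read.format("org.apache.hudi").load(basePath + "/*/*")
    assert(readDF.count() == 5)

    spark.stop()
  }
}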

Best,
Lamber-Ken


On 2020/05/21 18:59:02, Lian Jiang <ji...@gmail.com> wrote: 
> Thanks Shiyan and Vinoth. Unfortunately, adding
> org.apache.spark:spark-avro_2.11:2.4.4 throws another version related
> exception:
> 
> java.lang.NoSuchMethodError:
> org.apache.avro.Schema.createUnion([Lorg/apache/avro/Schema;)Lorg/apache/avro/Schema;
> 	at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:185)
> 	at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:176)
> 	at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:174)
> 	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
> 	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> 	at org.apache.spark.sql.types.StructType.foreach(StructType.scala:99)
> 	at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:174)
> 	at org.apache.hudi.AvroConversionUtils$.convertStructTypeToAvroSchema(AvroConversionUtils.scala:87)
> 	at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:93)
> 	at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
> 	at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
> 	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
> 	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
> 	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
> 	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> 	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
> 	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
> 	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
> 	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
> 	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> 	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> 	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
> 	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
> 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
> 	at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
> 	at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
> 	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
> 	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
> 	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
> 	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> 	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> 	at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> 	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> 	at org.scalatest.Transformer.apply(Transformer.scala:22)
> 	at org.scalatest.Transformer.apply(Transformer.scala:20)
> 	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> 	at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
> 	at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
> 	at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
> 	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> 	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> 	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> 	at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
> 	at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
> 	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> 	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> 	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
> 	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
> 	at scala.collection.immutable.List.foreach(List.scala:392)
> 	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> 	at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
> 	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> 	at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
> 	at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> 	at org.scalatest.Suite$class.run(Suite.scala:1147)
> 	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> 	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> 	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> 	at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> 	at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
> 	at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
> 	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> 	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> 	at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> 	at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> 	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> 	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> 	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> 	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> 	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> 	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> 	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> 	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> 	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> 	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> 	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> 	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> 	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> 	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> 	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> 	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> 	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> 	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> 	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> 	at java.base/java.lang.Thread.run(Thread.java:835)
> 
> 
> On Thu, May 21, 2020 at 10:46 AM Vinoth Chandar <vi...@apache.org> wrote:
> 
> > Wow.. Race condition :) ..
> >
> > Thanks for racing , Raymond!
> >
> > On Thu, May 21, 2020 at 10:08 AM Shiyan Xu <xu...@gmail.com>
> > wrote:
> >
> > > Hi Lian, it appears that you need to have spark-avro_2.11:2.4.4 in your
> > > classpath.
> > >
> > >
> > >
> > > On Thu, May 21, 2020 at 10:04 AM Lian Jiang <ji...@gmail.com>
> > wrote:
> > >
> > > > Thanks Balaji.
> > > >
> > > > My unit test failed due to dependency incompatibility. Any idea will be
> > > > highly appreciated!
> > > >
> > > >
> > > > The test is copied from hudi quick start:
> > > >
> > > > import org.apache.hudi.QuickstartUtils._
> > > >
> > > > import scala.collection.JavaConversions._
> > > > import org.apache.spark.sql.SaveMode._
> > > > import org.apache.hudi.DataSourceReadOptions._
> > > > import org.apache.hudi.DataSourceWriteOptions._
> > > > import org.apache.hudi.config.HoodieWriteConfig._
> > > >
> > > > class InputOutputTest extends HudiBaseTest{
> > > >
> > > > val config = new SparkConf().setAppName(name)
> > > >   config.set("spark.driver.allowMultipleContexts", "true")
> > > >   config.set("spark.serializer",
> > > > "org.apache.spark.serializer.KryoSerializer")
> > > >   config.setMaster("local[*]").setAppName("Local Test")
> > > >   val executionContext =
> > > > SparkSession.builder().config(config).getOrCreate()
> > > >
> > > > val tableName = "hudi_trips_cow"
> > > >   val basePath = "file:///tmp/hudi_trips_cow"
> > > >   val dataGen = new DataGenerator
> > > >
> > > >   override def beforeAll(): Unit = {
> > > >   }
> > > >
> > > >   test("Can create a hudi dataset") {
> > > >     val inserts = convertToStringList(dataGen.generateInserts(10))
> > > >     val df = executionContext.sparkSession.read.json(
> > > >       executionContext.sparkContext.parallelize(inserts, 2))
> > > >
> > > >     df.write.format("hudi").
> > > >       options(getQuickstartWriteConfigs).
> > > >       option(PRECOMBINE_FIELD_OPT_KEY, "ts").
> > > >       option(RECORDKEY_FIELD_OPT_KEY, "uuid").
> > > >       option(PARTITIONPATH_FIELD_OPT_KEY, "partitionpath").
> > > >       option(TABLE_NAME, tableName).
> > > >       mode(Overwrite).
> > > >       save(basePath)
> > > >   }
> > > > }
> > > >
> > > >
> > > > The exception is:
> > > >
> > > > java.lang.NoClassDefFoundError:
> > > org/apache/spark/sql/avro/SchemaConverters$
> > > >         at
> > > >
> > >
> > org.apache.hudi.AvroConversionUtils$.convertStructTypeToAvroSchema(AvroConversionUtils.scala:87)
> > > >         at
> > > >
> > >
> > org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:93)
> > > >         at
> > > > org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
> > > >         at
> > > >
> > >
> > org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
> > > >         at
> > > > org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
> > > >         at
> > > >
> > >
> > org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
> > > >         at
> > > > org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
> > > >         at
> > > > org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
> > > >         at
> > > >
> > >
> > com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
> > > >         at
> > > >
> > >
> > com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > > >         at
> > > >
> > >
> > com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
> > > >         at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> > > >         at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> > > >         at org.scalatest.Transformer.apply(Transformer.scala:22)
> > > >         at org.scalatest.Transformer.apply(Transformer.scala:20)
> > > >         at
> > > org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> > > >         at
> > org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
> > > >         at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
> > > >         at
> > > >
> > >
> > org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
> > > >         at
> > > >
> > >
> > org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > > >         at
> > > >
> > >
> > org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
> > > >         at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> > > >         at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
> > > >         at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
> > > >         at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > > >         at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
> > > >         at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
> > > >         at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
> > > >         at scala.collection.immutable.List.foreach(List.scala:392)
> > > >         at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> > > >         at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
> > > >         at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> > > >         at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
> > > >         at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> > > >         at org.scalatest.Suite$class.run(Suite.scala:1147)
> > > >         at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> > > >         at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > > >         at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
> > > >         at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> > > >         at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
> > > >         at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
> > > >         at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
> > > >         at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
> > > >         at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
> > > >         at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
> > > >         at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
> > > >         at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
> > > >         at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
> > > >         at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
> > > >         at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
> > > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > >         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > >         at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > > >         at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
> > > >         at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
> > > >         at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
> > > >         at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
> > > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >         at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > > >         at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > > >         at java.base/java.lang.reflect.Method.invoke(Method.java:567)
> > > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
> > > >         at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
> > > >         at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
> > > >         at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
> > > >         at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
> > > >         at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
> > > >         at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
> > > >         at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> > > >         at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> > > >         at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
> > > >         at java.base/java.lang.Thread.run(Thread.java:835)
> > > > Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.avro.SchemaConverters$
> > > >         at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
> > > >         at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
> > > >         at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
> > > >         ... 91 more

Re: hudi dependency conflicts for test

Posted by Shiyan Xu <xu...@gmail.com>.
Hi Lian,

From your 2nd email it seems like you downgraded Spark to 2.3. Could you
try using Spark 2.4+?

Please also refer to the release note for dependency versions.
https://hudi.apache.org/releases.html#release-051-incubating-docs
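
Concretely, that means keeping every Spark artifact on the same 2.4.x
release so that spark-avro and the Avro classes it was built against stay
in sync. A minimal Gradle sketch, assembled only from artifacts already
named in this thread (versions are illustrative, not a verified build):

dependencies {
    // Keep all Spark modules on one 2.4.x version so spark-avro matches.
    compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.4.4'
    compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.4.4'
    compile group: 'org.apache.spark', name: 'spark-avro_2.11', version: '2.4.4'
    // Hudi bundle built for Scala 2.11, as used elsewhere in this thread.
    compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11', version: '0.5.2-incubating'
}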




Re: hudi dependency conflicts for test

Posted by Lian Jiang <ji...@gmail.com>.
Thanks Shiyan and Vinoth. Unfortunately, adding
org.apache.spark:spark-avro_2.11:2.4.4 throws another version-related
exception:

java.lang.NoSuchMethodError:
org.apache.avro.Schema.createUnion([Lorg/apache/avro/Schema;)Lorg/apache/avro/Schema;
	at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:185)
	at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:176)
	at org.apache.spark.sql.avro.SchemaConverters$$anonfun$5.apply(SchemaConverters.scala:174)
	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
	at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
	at org.apache.spark.sql.types.StructType.foreach(StructType.scala:99)
	at org.apache.spark.sql.avro.SchemaConverters$.toAvroType(SchemaConverters.scala:174)
	at org.apache.hudi.AvroConversionUtils$.convertStructTypeToAvroSchema(AvroConversionUtils.scala:87)
	at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:93)
	at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
	at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
	at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
	at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
	at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
	at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
	at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
	at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
	at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
	at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
	at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
	at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
	at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
	at org.scalatest.Suite$class.run(Suite.scala:1147)
	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
	at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
	at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
	at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
	at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
	at java.base/java.lang.Thread.run(Thread.java:835)
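
A plausible reading of this trace: Schema.createUnion(Schema...) is a
varargs overload that only exists in newer Avro releases, so an older Avro
jar (for example one pulled in transitively by the Spark 2.3.0 artifacts
mentioned earlier in the thread) is likely still winning on the test
classpath, while spark-avro_2.11:2.4.4 expects the newer one. A hedged
Gradle sketch for checking and, if needed, pinning the resolved Avro
version (the forced version is an assumption to verify against your Spark
release, not a confirmed fix from this thread):

// First inspect which Avro version Gradle actually resolves for tests:
//   ./gradlew dependencyInsight --dependency org.apache.avro:avro --configuration testRuntimeClasspath
configurations.all {
    resolutionStrategy {
        // Assumed: Avro 1.8.x is the line that Spark 2.4.x's spark-avro compiles against.
        force 'org.apache.avro:avro:1.8.2'
    }
}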


On Thu, May 21, 2020 at 10:46 AM Vinoth Chandar <vi...@apache.org> wrote:

> Wow.. Race condition :) ..
>
> Thanks for racing , Raymond!
>
> On Thu, May 21, 2020 at 10:08 AM Shiyan Xu <xu...@gmail.com>
> wrote:
>
> > Hi Lian, it appears that you need to have spark-avro_2.11:2.4.4 in your
> > classpath.
> >
> >
> >
> > On Thu, May 21, 2020 at 10:04 AM Lian Jiang <ji...@gmail.com>
> wrote:
> >
> > > On Wed, May 20, 2020 at 1:43 PM Balaji Varadarajan
> > > <v....@ymail.com.invalid> wrote:
> > >
> > > >  Thanks for using Hudi. Looking at pom definitions between 0.5.1 and
> > > > 0.5.2, I don't see any difference that could cause this issue. As it
> > > works
> > > > with 0.5.2, I am assuming you are not blocked. Let us know otherwise.
> > > > Balaji.V    On Wednesday, May 20, 2020, 01:17:08 PM PDT, Lian Jiang <
> > > > jiangok2006@gmail.com> wrote:
> > > >
> > > >  Thanks Vinoth.
> > > >
> > > > Below dependency has no conflict:
> > > >
> > > > compile group: 'org.apache.spark', name: 'spark-core_2.11', version:
> > > > '2.3.0'
> > > > compile group: 'org.apache.spark', name: 'spark-sql_2.11', version:
> > > '2.3.0'
> > > > compile group: 'org.scala-lang', name: 'scala-library', version:
> > > '2.11.11'
> > > > compile group: 'com.github.scopt', name: 'scopt_2.11', version:
> '3.7.1'
> > > > compile group: 'com.amazonaws', name: 'aws-java-sdk', version:
> > '1.11.297'
> > > > compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11',
> > > > version: '0.5.2-incubating'
> > > > testCompile group: 'junit', name: 'junit', version: '4.12'
> > > > testCompile group: 'org.scalatest', name: 'scalatest_2.11', version:
> > > > '3.2.0-SNAP7'
> > > > testCompile group: 'org.mockito', name: 'mockito-scala_2.11',
> version:
> > > > '1.5.12'
> > > > compile group: 'org.apache.iceberg', name: 'iceberg-api', version:
> > > > '0.8.0-incubating'
> > > >
> > > > Cheers!
> > > >
> > > >
> > > > On Wed, May 20, 2020 at 5:00 AM Vinoth Chandar <vi...@apache.org>
> > > wrote:
> > > >
> > > > > Hi Leon,
> > > > >
> > > > > Sorry for the late reply.  Seems like a version mismatch for
> > mockito..
> > > > > I see you are already trying to exclude it though.. Could you share
> > the
> > > > > full stack trace?
> > > > >
> > > > >
> > > > >
> > > > >
> > > > > On Mon, May 18, 2020 at 1:12 PM Lian Jiang <ji...@gmail.com>
> > > > wrote:
> > > > >
> > > > > > Hi,
> > > > > >
> > > > > > I am using hudi in a scala gradle project:
> > > > > >
> > > > > > dependencies {
> > > > > >    compile group: 'org.apache.spark', name: 'spark-core_2.11',
> > > version:
> > > > > > '2.4.4'
> > > > > >    compile group: 'org.apache.spark', name: 'spark-sql_2.11',
> > > version:
> > > > > > '2.4.4'
> > > > > >    compile group: 'org.scala-lang', name: 'scala-library',
> version:
> > > > > > '2.11.11'
> > > > > >    compile group: 'com.github.scopt', name: 'scopt_2.11',
> version:
> > > > > '3.7.1'
> > > > > >    compile group: 'org.apache.spark', name: 'spark-avro_2.11',
> > > version:
> > > > > > '2.4.4'
> > > > > >    compile group: 'com.amazonaws', name: 'aws-java-sdk', version:
> > > > > > '1.11.297'
> > > > > >    compile group: 'com.zillow.datacontracts', name:
> > > > > > 'contract-evaluation-library', version: '0.1.0.master.98a438b'
> > > > > >    compile (group: 'org.apache.hudi', name: 'hudi-spark_2.11',
> > > > > > version: '0.5.1-incubating') {
> > > > > >        exclude group: 'org.scala-lang', module: 'scala-library'
> > > > > >        exclude group: 'org.scalatest', module: 'scalatest_2.12'
> > > > > >    }
> > > > > >
> > > > > >    testCompile group: 'junit', name: 'junit', version: '4.12'
> > > > > >    testCompile group: 'org.scalatest', name: 'scalatest_2.11',
> > > > > > version: '3.2.0-SNAP7'
> > > > > >    testCompile group: 'org.mockito', name: 'mockito-scala_2.11',
> > > > > > version: '1.5.12'
> > > > > > }
> > > > > >
> > > > > > Below code throws exception '
> > > > > > java.lang.NoSuchMethodError:
> > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
> org.scalatest.mockito.MockitoSugar.$init$(Lorg/scalatest/mockito/MockitoSugar;)V'
> > > > > >
> > > > > > import org.junit.runner.RunWith
> > > > > > import org.scalatest.FunSuite
> > > > > > import org.scalatest.junit.JUnitRunner
> > > > > > import org.scalatest.mockito.MockitoSugar
> > > > > >
> > > > > > @RunWith(classOf[JUnitRunner])
> > > > > > class BaseTest extends FunSuite with MockitoSugar {
> > > > > > }
> > > > > >
> > > > > > Removing org.apache.hudi from the dependency list will make the
> > code
> > > > > > work. Does anybody know how to include hudi dependency without
> > > > > > conflicting with the test?
> > > > > >
> > > > > > Appreciate any help!
> > > > > >
> > > > > > Regards
> > > > > >
> > > > > > Leon
> > > > > >
> > > > >
> > > >
> > > >
> > > > --
> > > >
> > > > Create your own email signature
> > > > <
> > > >
> > >
> >
> https://www.wisestamp.com/signature-in-email/?utm_source=promotion&utm_medium=signature&utm_campaign=create_your_own&srcid=5234462839406592
> > > > >
> > > >
> > >
> > >
> > >
> > > --
> > >
> > > Create your own email signature
> > > <
> > >
> >
> https://www.wisestamp.com/signature-in-email/?utm_source=promotion&utm_medium=signature&utm_campaign=create_your_own&srcid=5234462839406592
> > > >
> > >
> >
>


-- 

Create your own email signature
<https://www.wisestamp.com/signature-in-email/?utm_source=promotion&utm_medium=signature&utm_campaign=create_your_own&srcid=5234462839406592>

Re: hudi dependency conflicts for test

Posted by Vinoth Chandar <vi...@apache.org>.
Wow.. Race condition :) ..

Thanks for racing, Raymond!

On Thu, May 21, 2020 at 10:08 AM Shiyan Xu <xu...@gmail.com>
wrote:

> Hi Lian, it appears that you need to have spark-avro_2.11:2.4.4 in your
> classpath.
> [rest of quoted thread trimmed]
>

Re: hudi dependency conflicts for test

Posted by Shiyan Xu <xu...@gmail.com>.
Hi Lian, it appears that you need to have spark-avro_2.11:2.4.4 in your
classpath.
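
For a Gradle build like yours, that would be roughly the line below (a
sketch, assuming Spark 2.4.4 / Scala 2.11 as in your dependency list):

compile group: 'org.apache.spark', name: 'spark-avro_2.11', version: '2.4.4'

Hudi's AvroConversionUtils calls SchemaConverters from spark-avro at
runtime (that is the class in your NoClassDefFoundError), and hudi-spark
does not appear to pull spark-avro in transitively.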



On Thu, May 21, 2020 at 10:04 AM Lian Jiang <ji...@gmail.com> wrote:

> [quoted message trimmed; Lian's full message with the test code and
> stack trace appears below]

Re: hudi dependency conflicts for test

Posted by Vinoth Chandar <vi...@apache.org>.
Hi,

you may also need org.apache.spark:spark-avro_2.11:2.4.4 as a dependency,
as you can see in the spark-shell example..
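
For reference, the quick start launches the shell roughly like this (a
sketch; versions per the 0.5.2 quick start, adjust to your setup):

spark-shell \
  --packages org.apache.hudi:hudi-spark-bundle_2.11:0.5.2-incubating,org.apache.spark:spark-avro_2.11:2.4.4 \
  --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer'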

Thanks
Vinoth

On Thu, May 21, 2020 at 10:04 AM Lian Jiang <ji...@gmail.com> wrote:

> [quoted test code and stack trace trimmed; the full message appears below]
>
>
> On Wed, May 20, 2020 at 1:43 PM Balaji Varadarajan
> <v....@ymail.com.invalid> wrote:
>
> > Thanks for using Hudi. Looking at pom definitions between 0.5.1 and
> > 0.5.2, I don't see any difference that could cause this issue. As it
> > works with 0.5.2, I am assuming you are not blocked. Let us know otherwise.
> > Balaji.V
> >
> > On Wednesday, May 20, 2020, 01:17:08 PM PDT, Lian Jiang <
> > jiangok2006@gmail.com> wrote:
> >
> > Thanks Vinoth.
> >
> > The dependency list below has no conflict:
> >
> > compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.3.0'
> > compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.3.0'
> > compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.11'
> > compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
> > compile group: 'com.amazonaws', name: 'aws-java-sdk', version: '1.11.297'
> > compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11', version: '0.5.2-incubating'
> > testCompile group: 'junit', name: 'junit', version: '4.12'
> > testCompile group: 'org.scalatest', name: 'scalatest_2.11', version: '3.2.0-SNAP7'
> > testCompile group: 'org.mockito', name: 'mockito-scala_2.11', version: '1.5.12'
> > compile group: 'org.apache.iceberg', name: 'iceberg-api', version: '0.8.0-incubating'
> >
> > Cheers!
> >
> > On Wed, May 20, 2020 at 5:00 AM Vinoth Chandar <vi...@apache.org>
> > wrote:
> >
> > > Hi Leon,
> > >
> > > Sorry for the late reply. Seems like a version mismatch for mockito..
> > > I see you are already trying to exclude it though.. Could you share the
> > > full stack trace?
> > >
> > > On Mon, May 18, 2020 at 1:12 PM Lian Jiang <ji...@gmail.com>
> > > wrote:
> > >
> > > > [quoted original message and signature trimmed]

Re: hudi dependency conflicts for test

Posted by Lian Jiang <ji...@gmail.com>.
Thanks Balaji.

My unit test failed due to a dependency incompatibility. Any ideas would be
highly appreciated!


The test is copied from the Hudi quick start:

import org.apache.hudi.QuickstartUtils._

import scala.collection.JavaConversions._
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.SaveMode._
import org.apache.hudi.DataSourceReadOptions._
import org.apache.hudi.DataSourceWriteOptions._
import org.apache.hudi.config.HoodieWriteConfig._

class InputOutputTest extends HudiBaseTest {

  // Local Spark session for the test; Hudi requires the Kryo serializer.
  val config = new SparkConf()
  config.set("spark.driver.allowMultipleContexts", "true")
  config.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  config.setMaster("local[*]").setAppName("Local Test")
  val executionContext: SparkSession =
    SparkSession.builder().config(config).getOrCreate()

  val tableName = "hudi_trips_cow"
  val basePath = "file:///tmp/hudi_trips_cow"
  val dataGen = new DataGenerator

  override def beforeAll(): Unit = {
  }

  test("Can create a hudi dataset") {
    // Generate ten sample trip records and load them into a DataFrame.
    val inserts = convertToStringList(dataGen.generateInserts(10))
    val df = executionContext.read.json(
      executionContext.sparkContext.parallelize(inserts, 2))

    // Write the DataFrame out as a Hudi table (copy-on-write by default).
    df.write.format("hudi").
      options(getQuickstartWriteConfigs).
      option(PRECOMBINE_FIELD_OPT_KEY, "ts").
      option(RECORDKEY_FIELD_OPT_KEY, "uuid").
      option(PARTITIONPATH_FIELD_OPT_KEY, "partitionpath").
      option(TABLE_NAME, tableName).
      mode(Overwrite).
      save(basePath)
  }
}


The exception is:

java.lang.NoClassDefFoundError: org/apache/spark/sql/avro/SchemaConverters$
	at org.apache.hudi.AvroConversionUtils$.convertStructTypeToAvroSchema(AvroConversionUtils.scala:87)
	at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:93)
	at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
	at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
	at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
	at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply$mcV$sp(InputOutputTest.scala:34)
	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
	at com.zillow.dataforce_storage_poc.hudi.InputOutputTest$$anonfun$1.apply(InputOutputTest.scala:22)
	at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
	at org.scalatest.TestSuite$class.withFixture(TestSuite.scala:196)
	at org.scalatest.FunSuite.withFixture(FunSuite.scala:1560)
	at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
	at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
	at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
	at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
	at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
	at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
	at org.scalatest.Suite$class.run(Suite.scala:1147)
	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
	at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
	at com.zillow.dataforce_storage_poc.HudiBaseTest.org$scalatest$BeforeAndAfterAll$$super$run(HudiBaseTest.scala:5)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
	at com.zillow.dataforce_storage_poc.HudiBaseTest.run(HudiBaseTest.scala:5)
	at org.scalatest.junit.JUnitRunner.run(JUnitRunner.scala:99)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
	at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
	at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
	at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
	at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
	at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
	at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:118)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
	at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
	at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:412)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
	at java.base/java.lang.Thread.run(Thread.java:835)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.avro.SchemaConverters$
	at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:583)
	at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
	... 91 more
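The root cause is the bottom frame: org.apache.spark.sql.avro.SchemaConverters
lives in the spark-avro module, which Spark ships separately from spark-sql
(org.apache.spark:spark-avro first appeared as an external module in Spark
2.4), and hudi-spark reaches into it when converting the DataFrame schema to
Avro. One way out is to put a spark-avro artifact matching the Spark version
on the test classpath. A minimal sketch for a Gradle build like the one above;
the version numbers are assumptions and must line up with the spark-sql in use:

// supplies org.apache.spark.sql.avro.SchemaConverters; keep the version in
// lockstep with spark-core/spark-sql (2.4.4 assumed here)
compile group: 'org.apache.spark', name: 'spark-avro_2.11', version: '2.4.4'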


On Wed, May 20, 2020 at 1:43 PM Balaji Varadarajan
<v....@ymail.com.invalid> wrote:

>  Thanks for using Hudi. Looking at pom definitions between 0.5.1 and
> 0.5.2, I don't see any difference that could cause this issue. As it works
> with 0.5.2, I am assuming you are not blocked. Let us know otherwise.
> Balaji.V
>
> On Wednesday, May 20, 2020, 01:17:08 PM PDT, Lian Jiang <
> jiangok2006@gmail.com> wrote:
>
>  Thanks Vinoth.
>
> The dependency list below has no conflicts:
>
> compile group: 'org.apache.spark', name: 'spark-core_2.11', version:
> '2.3.0'
> compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.3.0'
> compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.11'
> compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
> compile group: 'com.amazonaws', name: 'aws-java-sdk', version: '1.11.297'
> compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11',
> version: '0.5.2-incubating'
> testCompile group: 'junit', name: 'junit', version: '4.12'
> testCompile group: 'org.scalatest', name: 'scalatest_2.11', version:
> '3.2.0-SNAP7'
> testCompile group: 'org.mockito', name: 'mockito-scala_2.11', version:
> '1.5.12'
> compile group: 'org.apache.iceberg', name: 'iceberg-api', version:
> '0.8.0-incubating'
>
> Cheers!
>
>
> On Wed, May 20, 2020 at 5:00 AM Vinoth Chandar <vi...@apache.org> wrote:
>
> > Hi Leon,
> >
> > Sorry for the late reply.  Seems like a version mismatch for mockito..
> > I see you are already trying to exclude it though.. Could you share the
> > full stack trace?
> >
> >
> >
> >
> > On Mon, May 18, 2020 at 1:12 PM Lian Jiang <ji...@gmail.com>
> wrote:
> >
> > > Hi,
> > >
> > > I am using hudi in a scala gradle project:
> > >
> > > dependencies {
> > >    compile group: 'org.apache.spark', name: 'spark-core_2.11', version:
> > > '2.4.4'
> > >    compile group: 'org.apache.spark', name: 'spark-sql_2.11', version:
> > > '2.4.4'
> > >    compile group: 'org.scala-lang', name: 'scala-library', version:
> > > '2.11.11'
> > >    compile group: 'com.github.scopt', name: 'scopt_2.11', version:
> > '3.7.1'
> > >    compile group: 'org.apache.spark', name: 'spark-avro_2.11', version:
> > > '2.4.4'
> > >    compile group: 'com.amazonaws', name: 'aws-java-sdk', version:
> > > '1.11.297'
> > >    compile group: 'com.zillow.datacontracts', name:
> > > 'contract-evaluation-library', version: '0.1.0.master.98a438b'
> > >    compile (group: 'org.apache.hudi', name: 'hudi-spark_2.11',
> > > version: '0.5.1-incubating') {
> > >        exclude group: 'org.scala-lang', module: 'scala-library'
> > >        exclude group: 'org.scalatest', module: 'scalatest_2.12'
> > >    }
> > >
> > >    testCompile group: 'junit', name: 'junit', version: '4.12'
> > >    testCompile group: 'org.scalatest', name: 'scalatest_2.11',
> > > version: '3.2.0-SNAP7'
> > >    testCompile group: 'org.mockito', name: 'mockito-scala_2.11',
> > > version: '1.5.12'
> > > }
> > >
> > > Below code throws exception '
> > > java.lang.NoSuchMethodError:
> > >
> > >
> >
> org.scalatest.mockito.MockitoSugar.$init$(Lorg/scalatest/mockito/MockitoSugar;)V'
> > >
> > > import org.junit.runner.RunWith
> > > import org.scalatest.FunSuite
> > > import org.scalatest.junit.JUnitRunner
> > > import org.scalatest.mockito.MockitoSugar
> > >
> > > @RunWith(classOf[JUnitRunner])
> > > class BaseTest extends FunSuite with MockitoSugar {
> > > }
> > >
> > > Removing org.apache.hudi from the dependency list will make the code
> > > work. Does anybody know how to include hudi dependency without
> > > conflicting with the test?
> > >
> > > Appreciate any help!
> > >
> > > Regards
> > >
> > > Leon
> > >
> >
>
>

Re: hudi dependency conflicts for test

Posted by Balaji Varadarajan <v....@ymail.com.INVALID>.
Thanks for using Hudi. Looking at the pom definitions between 0.5.1 and
0.5.2, I don't see any difference that could cause this issue. As it works
with 0.5.2, I am assuming you are not blocked. Let us know otherwise.

Balaji.V

On Wednesday, May 20, 2020, 01:17:08 PM PDT, Lian Jiang <ji...@gmail.com> wrote:

Thanks Vinoth.

The dependency list below has no conflicts:

compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.3.0'
compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.3.0'
compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.11'
compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
compile group: 'com.amazonaws', name: 'aws-java-sdk', version: '1.11.297'
compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11',
version: '0.5.2-incubating'
testCompile group: 'junit', name: 'junit', version: '4.12'
testCompile group: 'org.scalatest', name: 'scalatest_2.11', version:
'3.2.0-SNAP7'
testCompile group: 'org.mockito', name: 'mockito-scala_2.11', version: '1.5.12'
compile group: 'org.apache.iceberg', name: 'iceberg-api', version:
'0.8.0-incubating'

Cheers!


On Wed, May 20, 2020 at 5:00 AM Vinoth Chandar <vi...@apache.org> wrote:

> Hi Leon,
>
> Sorry for the late reply.  Seems like a version mismatch for mockito..
> I see you are already trying to exclude it though.. Could you share the
> full stack trace?
>
>
>
>
> On Mon, May 18, 2020 at 1:12 PM Lian Jiang <ji...@gmail.com> wrote:
>
> > Hi,
> >
> > I am using hudi in a scala gradle project:
> >
> > dependencies {
> >    compile group: 'org.apache.spark', name: 'spark-core_2.11', version:
> > '2.4.4'
> >    compile group: 'org.apache.spark', name: 'spark-sql_2.11', version:
> > '2.4.4'
> >    compile group: 'org.scala-lang', name: 'scala-library', version:
> > '2.11.11'
> >    compile group: 'com.github.scopt', name: 'scopt_2.11', version:
> '3.7.1'
> >    compile group: 'org.apache.spark', name: 'spark-avro_2.11', version:
> > '2.4.4'
> >    compile group: 'com.amazonaws', name: 'aws-java-sdk', version:
> > '1.11.297'
> >    compile group: 'com.zillow.datacontracts', name:
> > 'contract-evaluation-library', version: '0.1.0.master.98a438b'
> >    compile (group: 'org.apache.hudi', name: 'hudi-spark_2.11',
> > version: '0.5.1-incubating') {
> >        exclude group: 'org.scala-lang', module: 'scala-library'
> >        exclude group: 'org.scalatest', module: 'scalatest_2.12'
> >    }
> >
> >    testCompile group: 'junit', name: 'junit', version: '4.12'
> >    testCompile group: 'org.scalatest', name: 'scalatest_2.11',
> > version: '3.2.0-SNAP7'
> >    testCompile group: 'org.mockito', name: 'mockito-scala_2.11',
> > version: '1.5.12'
> > }
> >
> > Below code throws exception '
> > java.lang.NoSuchMethodError:
> >
> >
> org.scalatest.mockito.MockitoSugar.$init$(Lorg/scalatest/mockito/MockitoSugar;)V'
> >
> > import org.junit.runner.RunWith
> > import org.scalatest.FunSuite
> > import org.scalatest.junit.JUnitRunner
> > import org.scalatest.mockito.MockitoSugar
> >
> > @RunWith(classOf[JUnitRunner])
> > class BaseTest extends FunSuite with MockitoSugar {
> > }
> >
> > Removing org.apache.hudi from the dependency list will make the code
> > work. Does anybody know how to include hudi dependency without
> > conflicting with the test?
> >
> > Appreciate any help!
> >
> > Regards
> >
> > Leon
> >
>



Re: hudi dependency conflicts for test

Posted by Lian Jiang <ji...@gmail.com>.
Thanks Vinoth.

The dependency list below has no conflicts:

compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.3.0'
compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.3.0'
compile group: 'org.scala-lang', name: 'scala-library', version: '2.11.11'
compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
compile group: 'com.amazonaws', name: 'aws-java-sdk', version: '1.11.297'
compile group: 'org.apache.hudi', name: 'hudi-spark-bundle_2.11',
version: '0.5.2-incubating'
testCompile group: 'junit', name: 'junit', version: '4.12'
testCompile group: 'org.scalatest', name: 'scalatest_2.11', version:
'3.2.0-SNAP7'
testCompile group: 'org.mockito', name: 'mockito-scala_2.11', version: '1.5.12'
compile group: 'org.apache.iceberg', name: 'iceberg-api', version:
'0.8.0-incubating'

Cheers!
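For reference, hudi-spark-bundle is a shaded, self-contained jar, so unlike
the plain hudi-spark artifact it does not leak its own copy of scalatest onto
the test classpath; that is why the list above stops conflicting. If the
unbundled hudi-spark were still preferred, another option is pinning a single
scalatest build with Gradle's resolutionStrategy. A sketch only; the forced
version is an assumption and must match what the tests compile against:

configurations.all {
    resolutionStrategy {
        // when two versions of the same scalatest_2.11 module are requested,
        // always resolve to this one
        force 'org.scalatest:scalatest_2.11:3.2.0-SNAP7'
    }
}

Note that force only arbitrates between versions of one module; a stray
scalatest_2.12 artifact is a different module and would still need an
explicit exclude.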


On Wed, May 20, 2020 at 5:00 AM Vinoth Chandar <vi...@apache.org> wrote:

> Hi Leon,
>
> Sorry for the late reply.  Seems like a version mismatch for mockito..
> I see you are already trying to exclude it though.. Could you share the
> full stack trace?
>
>
>
>
> On Mon, May 18, 2020 at 1:12 PM Lian Jiang <ji...@gmail.com> wrote:
>
> > Hi,
> >
> > I am using hudi in a scala gradle project:
> >
> > dependencies {
> >     compile group: 'org.apache.spark', name: 'spark-core_2.11', version:
> > '2.4.4'
> >     compile group: 'org.apache.spark', name: 'spark-sql_2.11', version:
> > '2.4.4'
> >     compile group: 'org.scala-lang', name: 'scala-library', version:
> > '2.11.11'
> >     compile group: 'com.github.scopt', name: 'scopt_2.11', version:
> '3.7.1'
> >     compile group: 'org.apache.spark', name: 'spark-avro_2.11', version:
> > '2.4.4'
> >     compile group: 'com.amazonaws', name: 'aws-java-sdk', version:
> > '1.11.297'
> >     compile group: 'com.zillow.datacontracts', name:
> > 'contract-evaluation-library', version: '0.1.0.master.98a438b'
> >     compile (group: 'org.apache.hudi', name: 'hudi-spark_2.11',
> > version: '0.5.1-incubating') {
> >         exclude group: 'org.scala-lang', module: 'scala-library'
> >         exclude group: 'org.scalatest', module: 'scalatest_2.12'
> >     }
> >
> >     testCompile group: 'junit', name: 'junit', version: '4.12'
> >     testCompile group: 'org.scalatest', name: 'scalatest_2.11',
> > version: '3.2.0-SNAP7'
> >     testCompile group: 'org.mockito', name: 'mockito-scala_2.11',
> > version: '1.5.12'
> > }
> >
> > Below code throws exception '
> > java.lang.NoSuchMethodError:
> >
> >
> org.scalatest.mockito.MockitoSugar.$init$(Lorg/scalatest/mockito/MockitoSugar;)V'
> >
> > import org.junit.runner.RunWith
> > import org.scalatest.FunSuite
> > import org.scalatest.junit.JUnitRunner
> > import org.scalatest.mockito.MockitoSugar
> >
> > @RunWith(classOf[JUnitRunner])
> > class BaseTest extends FunSuite with MockitoSugar {
> > }
> >
> > Removing org.apache.hudi from the dependency list will make the code
> > work. Does anybody know how to include hudi dependency without
> > conflicting with the test?
> >
> > Appreciate any help!
> >
> > Regards
> >
> > Leon
> >
>



Re: hudi dependency conflicts for test

Posted by Vinoth Chandar <vi...@apache.org>.
Hi Leon,

Sorry for the late reply.  Seems like a version mismatch for mockito..
I see you are already trying to exclude it though.. Could you share the
full stack trace?
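One thing worth trying while the trace is gathered: exclude scalatest from
hudi-spark at the group level instead of only the scalatest_2.12 module, so
that no transitive copy survives for any Scala binary version. A sketch
against the dependency block quoted below, assuming the conflicting
MockitoSugar really does arrive through hudi-spark:

compile(group: 'org.apache.hudi', name: 'hudi-spark_2.11',
        version: '0.5.1-incubating') {
    exclude group: 'org.scala-lang', module: 'scala-library'
    // no module filter: drops every org.scalatest artifact hudi-spark pulls in
    exclude group: 'org.scalatest'
}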




On Mon, May 18, 2020 at 1:12 PM Lian Jiang <ji...@gmail.com> wrote:

> Hi,
>
> I am using hudi in a scala gradle project:
>
> dependencies {
>     compile group: 'org.apache.spark', name: 'spark-core_2.11', version:
> '2.4.4'
>     compile group: 'org.apache.spark', name: 'spark-sql_2.11', version:
> '2.4.4'
>     compile group: 'org.scala-lang', name: 'scala-library', version:
> '2.11.11'
>     compile group: 'com.github.scopt', name: 'scopt_2.11', version: '3.7.1'
>     compile group: 'org.apache.spark', name: 'spark-avro_2.11', version:
> '2.4.4'
>     compile group: 'com.amazonaws', name: 'aws-java-sdk', version:
> '1.11.297'
>     compile group: 'com.zillow.datacontracts', name:
> 'contract-evaluation-library', version: '0.1.0.master.98a438b'
>     compile (group: 'org.apache.hudi', name: 'hudi-spark_2.11',
> version: '0.5.1-incubating') {
>         exclude group: 'org.scala-lang', module: 'scala-library'
>         exclude group: 'org.scalatest', module: 'scalatest_2.12'
>     }
>
>     testCompile group: 'junit', name: 'junit', version: '4.12'
>     testCompile group: 'org.scalatest', name: 'scalatest_2.11',
> version: '3.2.0-SNAP7'
>     testCompile group: 'org.mockito', name: 'mockito-scala_2.11',
> version: '1.5.12'
> }
>
> Below code throws exception '
> java.lang.NoSuchMethodError:
>
> org.scalatest.mockito.MockitoSugar.$init$(Lorg/scalatest/mockito/MockitoSugar;)V'
>
> import org.junit.runner.RunWith
> import org.scalatest.FunSuite
> import org.scalatest.junit.JUnitRunner
> import org.scalatest.mockito.MockitoSugar
>
> @RunWith(classOf[JUnitRunner])
> class BaseTest extends FunSuite with MockitoSugar {
> }
>
> Removing org.apache.hudi from the dependency list will make the code
> work. Does anybody know how to include hudi dependency without
> conflicting with the test?
>
> Appreciate any help!
>
> Regards
>
> Leon
>