Posted to commits@hudi.apache.org by "Raymond Xu (Jira)" <ji...@apache.org> on 2021/11/22 10:30:00 UTC

[jira] [Resolved] (HUDI-2455) Fail to build Spark caused by spark_avro dependency

     [ https://issues.apache.org/jira/browse/HUDI-2455?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Raymond Xu resolved HUDI-2455.
------------------------------

> Fail to build Spark caused by spark_avro dependency
> ---------------------------------------------------
>
>                 Key: HUDI-2455
>                 URL: https://issues.apache.org/jira/browse/HUDI-2455
>             Project: Apache Hudi
>          Issue Type: Bug
>            Reporter: Yann Byron
>            Assignee: Yann Byron
>            Priority: Major
>              Labels: easyfix, pull-request-available
>             Fix For: 0.10.0
>
>
> Hudi fails to build with the following command:
>  
> {code:java}
> mvn clean package -DskipTests
> {code}
>  
> The key error information:
>  
> {code:java}
> [ERROR] /hudi/hudi-integ-test/src/main/scala/org/apache/hudi/integ/testsuite/utils/SparkSqlUtils.scala:32: error: object avro is not a member of package org.apache.spark.sql
> [ERROR] import org.apache.spark.sql.avro.SchemaConverters
> [ERROR] ^
> [ERROR] /hudi/hudi-integ-test/src/main/scala/org/apache/hudi/integ/testsuite/utils/SparkSqlUtils.scala:142: error: not found: value SchemaConverters
> [ERROR] val structType = SchemaConverters.toSqlType(schema).dataType.asInstanceOf[StructType]
> [ERROR] ^
> [ERROR] two errors found
> {code}
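>
> For reference, a minimal Scala sketch of the kind of conversion SparkSqlUtils performs (the Avro schema literal below is made up for illustration). SchemaConverters lives in the spark-avro module, which is why the module cannot compile without that dependency:
> {code:java}
> import org.apache.avro.Schema
> import org.apache.spark.sql.avro.SchemaConverters
> import org.apache.spark.sql.types.StructType
>
> // Hypothetical Avro record schema, used only to illustrate the conversion.
> val avroSchema: Schema = new Schema.Parser().parse(
>   """{"type":"record","name":"Example","fields":[{"name":"id","type":"long"}]}""")
>
> // SchemaConverters comes from spark-avro, so this line fails to compile
> // unless spark-avro_${scala.binary.version} is on the compile classpath.
> val structType: StructType =
>   SchemaConverters.toSqlType(avroSchema).dataType.asInstanceOf[StructType]
> {code}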
>  
> Adding the spark-avro dependency to the hudi-integ-test module fixes the build:
> {code:java}
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-avro_${scala.binary.version}</artifactId>
> </dependency>
> {code}
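>
> As a sanity check (assuming the spark-avro version is managed by the parent POM, since no version is declared above), rebuilding only the affected module should now succeed, e.g.:
> {code:java}
> mvn clean package -DskipTests -pl hudi-integ-test -am
> {code}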
>  



--
This message was sent by Atlassian Jira
(v8.20.1#820001)