Posted to commits@hudi.apache.org by "nicolas paris (Jira)" <ji...@apache.org> on 2021/09/13 13:27:00 UTC

[jira] [Created] (HUDI-2426) spark sql extensions breaks read.table from metastore

nicolas paris created HUDI-2426:
-----------------------------------

             Summary: spark sql extensions breaks read.table from metastore
                 Key: HUDI-2426
                 URL: https://issues.apache.org/jira/browse/HUDI-2426
             Project: Apache Hudi
          Issue Type: Bug
          Components: Hive Integration
            Reporter: nicolas paris


Enabling the Hudi Spark SQL support breaks the ability to read a Hudi table from the Hive metastore in Spark:

 bash-4.2$ ./spark3.0.2/bin/spark-shell --packages org.apache.hudi:hudi-spark3-bundle_2.12:0.9.0,org.apache.spark:spark-avro_2.12:3.1.2 --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'


scala> spark.table("default.test_hudi_table").show
java.lang.UnsupportedOperationException: Unsupported parseMultipartIdentifier method
 at org.apache.spark.sql.parser.HoodieCommonSqlParser.parseMultipartIdentifier(HoodieCommonSqlParser.scala:65)
 at org.apache.spark.sql.SparkSession.table(SparkSession.scala:581)
 ... 47 elided
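
The trace shows HoodieCommonSqlParser.parseMultipartIdentifier throwing UnsupportedOperationException outright, even though SparkSession.table relies on that method to resolve the table name. A minimal sketch of a delegating override, assuming the parser keeps a reference to Spark's underlying ParserInterface (named "delegate" here for illustration):

  // sketch only: forward the call to Spark's own parser instead of throwing
  override def parseMultipartIdentifier(sqlText: String): Seq[String] =
    delegate.parseMultipartIdentifier(sqlText)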


Removing the spark.sql.extensions config makes the Hive table readable again from Spark.
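
Until this is fixed, one possible workaround is to keep the extension enabled but bypass the metastore lookup, loading the table directly by its base path (the path below is a placeholder):

scala> // workaround sketch: read the Hudi table by base path instead of spark.table
scala> val df = spark.read.format("hudi").load("/path/to/test_hudi_table")
scala> df.show()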

This affects at least Spark 3.0.x and 3.1.x.


