Posted to commits@hudi.apache.org by "sivabalan narayanan (Jira)" <ji...@apache.org> on 2022/01/09 19:41:00 UTC

[jira] [Commented] (HUDI-2426) spark sql extensions breaks read.table from metastore

    [ https://issues.apache.org/jira/browse/HUDI-2426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17471459#comment-17471459 ] 

sivabalan narayanan commented on HUDI-2426:
-------------------------------------------

Hey [~parisni]: can you please share any further updates from your end? We are looking to see whether there is a bug here and, if so, whether we can get a fix into 0.10.1. If the proposed solution worked for you, let us know and we can close the Jira.

> spark sql extensions breaks read.table from metastore
> -----------------------------------------------------
>
>                 Key: HUDI-2426
>                 URL: https://issues.apache.org/jira/browse/HUDI-2426
>             Project: Apache Hudi
>          Issue Type: Bug
>          Components: Spark Integration
>            Reporter: nicolas paris
>            Assignee: Yann Byron
>            Priority: Critical
>              Labels: sev:critical, user-support-issues
>             Fix For: 0.10.1
>
>
> When the Hudi Spark SQL extension is enabled, reading a Hudi table registered in the Hive metastore from Spark breaks:
>  bash-4.2$ ./spark3.0.2/bin/spark-shell --packages org.apache.hudi:hudi-spark3-bundle_2.12:0.9.0,org.apache.spark:spark-avro_2.12:3.1.2 --conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
>  
> scala> spark.table("default.test_hudi_table").show
> java.lang.UnsupportedOperationException: Unsupported parseMultipartIdentifier method
>  at org.apache.spark.sql.parser.HoodieCommonSqlParser.parseMultipartIdentifier(HoodieCommonSqlParser.scala:65)
>  at org.apache.spark.sql.SparkSession.table(SparkSession.scala:581)
>  ... 47 elided
>  
> Removing the config makes the Hive table readable again from Spark.
> This affects at least Spark 3.0.x and 3.1.x.
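
For context, the stack trace above shows HoodieCommonSqlParser.parseMultipartIdentifier throwing UnsupportedOperationException instead of forwarding the call to Spark's own parser, which is what spark.table(...) ends up invoking to resolve the multi-part table name. Below is a minimal sketch of the delegation pattern that avoids this failure. It is illustrative only, not the actual Hudi fix: the class name DelegatingSqlParser is hypothetical, and the method set assumes the Spark 3.0.x/3.1.x ParserInterface.

    import org.apache.spark.sql.catalyst.{FunctionIdentifier, TableIdentifier}
    import org.apache.spark.sql.catalyst.expressions.Expression
    import org.apache.spark.sql.catalyst.parser.ParserInterface
    import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
    import org.apache.spark.sql.types.{DataType, StructType}

    // Hypothetical, simplified extension parser showing the delegation
    // pattern. A SQL-extension parser wraps Spark's own parser (the
    // "delegate") and must forward every ParserInterface method it does
    // not handle itself; throwing instead produces errors like the one
    // reported above.
    class DelegatingSqlParser(delegate: ParserInterface) extends ParserInterface {

      // A real extension would try its own grammar here first and fall
      // back to the delegate for statements it does not recognize.
      override def parsePlan(sqlText: String): LogicalPlan =
        delegate.parsePlan(sqlText)

      override def parseExpression(sqlText: String): Expression =
        delegate.parseExpression(sqlText)

      override def parseTableIdentifier(sqlText: String): TableIdentifier =
        delegate.parseTableIdentifier(sqlText)

      override def parseFunctionIdentifier(sqlText: String): FunctionIdentifier =
        delegate.parseFunctionIdentifier(sqlText)

      // The method in the stack trace above: spark.table("db.tbl") calls
      // this to resolve the multi-part name, so it must delegate rather
      // than throw UnsupportedOperationException.
      override def parseMultipartIdentifier(sqlText: String): Seq[String] =
        delegate.parseMultipartIdentifier(sqlText)

      override def parseTableSchema(sqlText: String): StructType =
        delegate.parseTableSchema(sqlText)

      override def parseDataType(sqlText: String): DataType =
        delegate.parseDataType(sqlText)

      // Present in the Spark 3.0.x/3.1.x ParserInterface; removed in
      // later Spark releases, so match this to your Spark version.
      override def parseRawDataType(sqlText: String): DataType =
        delegate.parseRawDataType(sqlText)
    }

A session extension would then wire this in through SparkSessionExtensions, e.g. injectParser((session, delegate) => new DelegatingSqlParser(delegate)), so that Spark's default parser keeps handling everything the extension does not.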



--
This message was sent by Atlassian Jira
(v8.20.1#820001)