Posted to commits@hudi.apache.org by "caokaizhi (via GitHub)" <gi...@apache.org> on 2023/03/08 03:44:08 UTC

[GitHub] [hudi] caokaizhi commented on issue #8061: [SUPPORT]Unable to read hudi table and got an IllegalArgumentException: For input string: "null"

caokaizhi commented on issue #8061:
URL: https://github.com/apache/hudi/issues/8061#issuecomment-1459350020

   I also hit this problem when using Hudi 0.13.0 on Spark 3.3.2: the exception is thrown when querying a MOR table with the merge type "REALTIME_PAYLOAD_COMBINE". The root cause is that Spark 3.3.2 is not compatible with the ParquetToSparkSchemaConverter class of Spark 3.3.1. In Spark 3.3.2 the ParquetToSparkSchemaConverter constructor additionally requires the "LEGACY_PARQUET_NANOS_AS_LONG" configuration parameter, whereas the buildReaderWithPartitionValues method of Hudi's Spark32PlusHoodieParquetFileFormat class never initializes this value. So my conclusion is that Hudi 0.13.0 is currently not compatible with Spark 3.3.2.
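
   Here is a minimal sketch of how I think the failure happens (my reading of the code, not a verified trace): Spark 3.3.2's ParquetToSparkSchemaConverter, when built from a Hadoop Configuration, looks up the LEGACY_PARQUET_NANOS_AS_LONG key and calls .toBoolean on the result, while Hudi's Spark32PlusHoodieParquetFileFormat, written against Spark 3.3.1, never copies that key into the Hadoop configuration it passes along. Calling .toBoolean on the resulting null string is exactly what produces `IllegalArgumentException: For input string: "null"`. The object name NanosAsLongRepro below is just for illustration; only SQLConf.LEGACY_PARQUET_NANOS_AS_LONG and the Hadoop Configuration lookup are taken from the real classes:

   ```scala
   import org.apache.hadoop.conf.Configuration
   import org.apache.spark.sql.internal.SQLConf

   object NanosAsLongRepro {
     def main(args: Array[String]): Unit = {
       // Hadoop configuration the way Hudi 0.13.0 hands it to the converter:
       // the nanos-as-long key introduced in Spark 3.3.2 is never set on it.
       val hadoopConf = new Configuration()

       // Returns null because the key is absent from the configuration.
       val raw = hadoopConf.get(SQLConf.LEGACY_PARQUET_NANOS_AS_LONG.key)

       // Scala's StringOps.toBoolean on a null string throws
       // java.lang.IllegalArgumentException: For input string: "null",
       // which matches the error reported in this issue.
       val nanosAsLong = raw.toBoolean
       println(nanosAsLong)
     }
   }
   ```

   If that reading is right, an untested stopgap might be to make the key visible in the Hadoop configuration (for example via `--conf spark.hadoop.spark.sql.legacy.parquet.nanosAsLong=false`, assuming that is the key's property name), but the real fix is for Hudi to pass this parameter when constructing the converter.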


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@hudi.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org