Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2022/09/01 15:46:11 UTC

[GitHub] [hudi] rjmblc commented on issue #6341: [SUPPORT] Hudi delete not working via spark apis

rjmblc commented on issue #6341:
URL: https://github.com/apache/hudi/issues/6341#issuecomment-1234463865

   @nsivabalan I added both of these configs `"spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension" "spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog"` and removed the payload class.
    I'm still stuck with the following error message, even after trying to change the Hudi bundle version to 11. Any tips for further troubleshooting?
   Error Message:
   `py4j.protocol.Py4JJavaError: An error occurred while calling o90.save.
   : org.apache.spark.SparkException: Cannot find catalog plugin class for catalog 'spark_catalog': org.apache.spark.sql.hudi.catalog.HoodieCatalog`
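   
   For reference, below is a minimal sketch of how these two settings can be passed when building the PySpark session. The bundle jar name and path are placeholders, not the exact ones in use here; as far as I know, `org.apache.spark.sql.hudi.catalog.HoodieCatalog` only ships in the Hudi 0.11+ bundles built for Spark 3.2+, so a bundle/Spark version mismatch would produce exactly this "Cannot find catalog plugin class" error.
   
   ```python
   from pyspark.sql import SparkSession
   
   spark = (
       SparkSession.builder
       .appName("hudi-delete-example")
       # Placeholder jar path: the bundle must match the cluster's Spark 3.x line,
       # otherwise HoodieCatalog is not on the classpath.
       .config("spark.jars", "/path/to/hudi-spark3.2-bundle_2.12-0.11.1.jar")
       .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
       .config("spark.sql.extensions",
               "org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
       .config("spark.sql.catalog.spark_catalog",
               "org.apache.spark.sql.hudi.catalog.HoodieCatalog")
       .getOrCreate()
   )
   ```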


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@hudi.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org