Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2020/01/27 22:26:10 UTC

[GitHub] [incubator-hudi] haospotai edited a comment on issue #1284: [SUPPORT]

haospotai edited a comment on issue #1284: [SUPPORT]
URL: https://github.com/apache/incubator-hudi/issues/1284#issuecomment-578983670
 
 
   > @haospotai It looks like an issue with shading of jars: the Hadoop metastore jars are shaded in the bundle you might be using, but somehow a class hierarchy is expecting a non-shaded one?
   > 
   > How are you starting the Python client? Could you paste the libs you are passing? Also, please raise a JIRA for further discussion.
   
   ```python
   import os

   from dotenv import load_dotenv
   from pyspark.sql import SparkSession

   # ROOT_DIR points to the project root and is defined elsewhere in the app.
   # "HudiClient" is only a placeholder name for the enclosing class.
   class HudiClient:
       def __init__(self, app_name):
           load_dotenv('../.env')

           # hudi-spark-bundle-0.5.0-incubating.jar
           spark_bundle = ROOT_DIR + "/resources/" + os.environ['HUDI_SPARK_BUNDLE']
           self.host = os.environ['HDFS_HOST']
           self.hive_base_path = os.environ['HIVE_BASH_PATH']
           self.spark = SparkSession.builder \
               .master(os.environ['SPARK_MASTER']) \
               .appName(app_name) \
               .config("spark.jars", spark_bundle) \
               .config("spark.driver.extraClassPath", spark_bundle) \
               .getOrCreate()
   ```
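   On the shading question above, one way to check what the bundle actually contains is to list the metastore classes packaged inside the jar. This is only a diagnostic sketch added for illustration (it was not part of the original comment); the jar path is a placeholder for the resolved `spark_bundle` path above.
   
   ```python
   # Diagnostic sketch (illustrative, not from the original app): list Hive
   # metastore classes packaged inside the hudi-spark-bundle jar, to see whether
   # they sit under the original org/apache/hadoop/hive/metastore package
   # (unshaded) or under a relocated (shaded) package prefix.
   import zipfile

   bundle_path = "resources/hudi-spark-bundle-0.5.0-incubating.jar"  # placeholder path
   with zipfile.ZipFile(bundle_path) as jar:
       for entry in jar.namelist():
           if entry.endswith(".class") and "hive/metastore" in entry:
               print(entry)
   ```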
   The exception is thrown once ```"hoodie.datasource.hive_sync.enable", "true"``` is set, because the app syncs data to Hive at the end of the write.
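   For context, that sync is triggered by the Hudi datasource options on the write path. A rough sketch of such a write follows; it is illustrative only (the table name, record key, partition field, and JDBC URL are placeholders, and `df` stands for the DataFrame the app writes), with the option keys being the standard Hudi 0.5.0 datasource options.
   
   ```python
   # Illustrative sketch of a write that ends with Hive sync. `self.hive_base_path`
   # comes from the constructor above, and `df` (the DataFrame being written) is
   # assumed to exist in the application; all values below are placeholders.
   hudi_options = {
       "hoodie.table.name": "my_table",
       "hoodie.datasource.write.recordkey.field": "uuid",
       "hoodie.datasource.write.partitionpath.field": "partition_path",
       "hoodie.datasource.write.precombine.field": "ts",
       # Enabling this option is what makes Hudi talk to the Hive metastore at
       # the end of the write, which is where the shading exception surfaces.
       "hoodie.datasource.hive_sync.enable": "true",
       "hoodie.datasource.hive_sync.database": "default",
       "hoodie.datasource.hive_sync.table": "my_table",
       "hoodie.datasource.hive_sync.jdbcurl": "jdbc:hive2://localhost:10000",
   }

   df.write.format("org.apache.hudi") \
       .options(**hudi_options) \
       .mode("append") \
       .save(self.hive_base_path + "/my_table")
   ```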

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services