Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2019/09/10 08:52:39 UTC

[GitHub] [incubator-hudi] cdmikechen edited a comment on issue #869: Hudi Spark error when spark bundle jar is added to spark's classpath

cdmikechen edited a comment on issue #869: Hudi Spark error when spark bundle jar is added to spark's classpath
URL: https://github.com/apache/incubator-hudi/issues/869#issuecomment-529839338
 
 
   This always happens when a SparkSession is started inside an existing JVM process and tasks are submitted to Spark from there. I have run into this problem many times when starting a Spring Boot project and using a SparkSession to run code that calls a UDF.
   In my opinion, this problem may be a serialization/deserialization issue between the Java and Scala sides.
   If we run the code with `spark-submit` or `spark-shell`, or submit it locally, everything runs in a single Spark environment with the same Scala version. But if we run the code from an IDE or from a Java application that calls into Spark, extra serialization and deserialization has to happen between the two environments, and that is when this error is reported.
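   To make the scenario concrete, below is a minimal sketch (not code from this issue; class, app, and column names are illustrative) of the embedded setup described above: a SparkSession created inside an already-running JVM, such as a Spring Boot service, that registers and applies a UDF without going through `spark-submit`.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, udf}

object EmbeddedSparkExample {
  def main(args: Array[String]): Unit = {
    // SparkSession created directly inside the host JVM, no spark-submit involved.
    val spark = SparkSession.builder()
      .appName("embedded-hudi-example")
      .master("local[*]")
      .getOrCreate()

    import spark.implicits._

    // A trivial UDF; the closure is serialized by Spark, which is where a
    // Scala/Java version or classpath mismatch tends to surface.
    val toUpper = udf((s: String) => if (s == null) null else s.toUpperCase)

    val df = Seq(("1", "foo"), ("2", "bar")).toDF("id", "name")
    df.withColumn("name_upper", toUpper(col("name"))).show()

    spark.stop()
  }
}
```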
   My workaround is to run such programs with `spark-submit`; then the error does not occur.
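   For example (the jar and class names here are placeholders, not taken from this issue), packaging the job and launching it with something like `spark-submit --class com.example.HudiUdfJob --jars hudi-spark-bundle.jar my-job.jar` keeps the driver inside the Scala/Spark environment shipped with the cluster, which is what avoids the mismatch described above.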
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services