Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2021/11/19 13:28:09 UTC

[GitHub] [hudi] Limess commented on issue #4043: [SUPPORT] java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.sql.Row error when writing particular source data after column addition

Limess commented on issue #4043:
URL: https://github.com/apache/hudi/issues/4043#issuecomment-974072982


   OK
   
   We got this working again. We had to:
   
   1. Set `hoodie.datasource.write.reconcile.schema=false`
   2. Add the column to any new data. Without doing this, all the old data failed the schema validation step (see the sketch below).
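   
   For anyone hitting the same error, here is a minimal sketch of the two steps above. It is not from the original job; the table name, record key, precombine field, column name `new_col`, and base path are all hypothetical placeholders, and `df` is assumed to be the incoming batch as a Spark DataFrame.
   
   ```scala
   import org.apache.spark.sql.SaveMode
   import org.apache.spark.sql.functions.lit
   
   // Step 2: make sure every incoming batch carries the new column,
   // backfilling a typed null where the source did not provide it.
   // "new_col" is a hypothetical column name for illustration.
   val withNewCol =
     if (df.columns.contains("new_col")) df
     else df.withColumn("new_col", lit(null).cast("string"))
   
   // Step 1: write with schema reconciliation disabled.
   withNewCol.write
     .format("hudi")
     .option("hoodie.table.name", "my_table")                     // hypothetical table name
     .option("hoodie.datasource.write.recordkey.field", "id")     // hypothetical record key
     .option("hoodie.datasource.write.precombine.field", "ts")    // hypothetical precombine field
     .option("hoodie.datasource.write.reconcile.schema", "false")
     .mode(SaveMode.Append)
     .save("s3://bucket/path/to/table")                           // hypothetical base path
   ```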
   
   I'd assumed `hoodie.datasource.write.reconcile.schema=true` was solely meant to handle this use case, so I don't understand why it fails here.

