Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/04/08 14:09:05 UTC

[GitHub] [spark] mazeboard edited a comment on issue #24299: [SPARK-27388][SQL] expression encoder for objects defined by properties

mazeboard edited a comment on issue #24299: [SPARK-27388][SQL] expression encoder for objects defined by properties
URL: https://github.com/apache/spark/pull/24299#issuecomment-480845183
 
 
   1. The JavaBean-based encoder does not support Avro fixed types, because a
   fixed type has a single property, named `bytes`; JavaBean introspection only
   picks up properties whose accessors are prefixed with set/get (see the first
   sketch after this list).
   
   2. I believe that the current implementation has a bug: line 136 of
   sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/JavaTypeInference.scala
   should be
       val properties = getJavaBeanReadableAndWritableProperties(other)
   and not
       val properties = getJavaBeanReadableProperties(other)
   
   3. toDS and toDF always use the expression encoder. I corrected the bug
   mentioned above and tested it using Encoders.bean, and I ran into three
   issues (see the sketches after this list):
    1. ds.map fails with "no encoder found" because it uses ScalaReflection to
   look up an encoder
    2. Encoders.bean fails for Java enums (the assertion fails: not a
   StructType, since an enum is saved as a String)
    3. Avro fixed types are not supported because the property in a fixed type
   is not prefixed with set/get
   
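   As a rough illustration of points 1 and 3.3, here is a small sketch (the
   FixedLike and BytesBean classes are made up for illustration, standing in
   for an Avro fixed type and a plain JavaBean): java.beans introspection,
   which the bean encoder relies on, only discovers properties exposed through
   get/set pairs, so an accessor simply named `bytes` is not seen as a
   property:

      import java.beans.Introspector

      // Stand-in for an Avro fixed type: the value is exposed through an
      // accessor named `bytes`, not through getBytes/setBytes.
      class FixedLike(private var value: Array[Byte]) {
        def bytes(): Array[Byte] = value
        def bytes(v: Array[Byte]): Unit = { value = v }
      }

      // A conventional JavaBean exposing the same data via get/set.
      class BytesBean {
        private var value: Array[Byte] = Array.empty[Byte]
        def getBytes: Array[Byte] = value
        def setBytes(v: Array[Byte]): Unit = { value = v }
      }

      object IntrospectionDemo {
        // List the readable bean properties of a class, ignoring the `class`
        // property inherited from getClass.
        def readableProps(cls: Class[_]): Seq[String] =
          Introspector.getBeanInfo(cls).getPropertyDescriptors.toSeq
            .filter(_.getReadMethod != null)
            .map(_.getName)
            .filterNot(_ == "class")

        def main(args: Array[String]): Unit = {
          println(readableProps(classOf[FixedLike]))  // empty: `bytes` is not picked up
          println(readableProps(classOf[BytesBean]))  // contains "bytes"
        }
      }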
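
   For point 3.1, a minimal sketch of the situation described (the PersonBean
   class and the local SparkSession setup are illustrative, not from this PR):
   spark.implicits._ only derives encoders for Scala types via ScalaReflection,
   so ds.map on a bean-encoded Dataset needs the bean encoder passed
   explicitly:

      import org.apache.spark.sql.{Encoders, SparkSession}

      // Bean class used only for illustration.
      class PersonBean extends Serializable {
        private var name: String = _
        private var age: Int = 0
        def getName: String = name
        def setName(n: String): Unit = { name = n }
        def getAge: Int = age
        def setAge(a: Int): Unit = { age = a }
      }

      object BeanMapDemo {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder().master("local[*]").appName("bean-map").getOrCreate()
          val beanEncoder = Encoders.bean(classOf[PersonBean])

          val p = new PersonBean
          p.setName("a"); p.setAge(1)
          val ds = spark.createDataset(Seq(p))(beanEncoder)

          // ds.map(b => b) alone would need an implicit Encoder[PersonBean];
          // spark.implicits._ does not provide one for a JavaBean, so the bean
          // encoder is supplied explicitly in map's second parameter list.
          val mapped = ds.map(b => b)(beanEncoder)
          mapped.show()

          spark.stop()
        }
      }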
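
   For point 3.2, a short sketch with a made-up bean whose only property is a
   JDK enum (java.time.DayOfWeek); whether Encoders.bean fails here depends on
   the Spark version and code path, but the enum is represented as a string
   column rather than a nested StructType, which is what trips the assertion
   described above:

      import java.time.DayOfWeek
      import org.apache.spark.sql.Encoders

      // Bean whose only property is a Java enum, for illustration.
      class DayBean extends Serializable {
        private var day: DayOfWeek = DayOfWeek.MONDAY
        def getDay: DayOfWeek = day
        def setDay(d: DayOfWeek): Unit = { day = d }
      }

      object EnumBeanDemo {
        def main(args: Array[String]): Unit = {
          // Where enums are supported, the enum property appears as a string
          // column in the schema rather than a nested StructType.
          val encoder = Encoders.bean(classOf[DayBean])
          println(encoder.schema.treeString)
        }
      }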
