Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/07/16 07:02:31 UTC

[GitHub] [spark] yyhpys edited a comment on issue #21012: [SPARK-23890][SQL] Support CHANGE COLUMN to add nested fields to structs

yyhpys edited a comment on issue #21012: [SPARK-23890][SQL] Support CHANGE COLUMN to add nested fields to structs
URL: https://github.com/apache/spark/pull/21012#issuecomment-511692305
 
 
   @ottomata Hi. I have been using your PR in a production Thrift server and really appreciate the contribution.
   However, there was a type comparison issue with ArrayType fields (the strict dataType equality check rejects an array column whose element struct only gains new fields), so I am using it with the following additional modification.
   
   ```scala
     private def columnsCompatible(
       ours: StructField,
       theirs: StructField,
       resolver: Resolver
     ): Boolean = {
       // If we can't resolve between the column names, then these are not compatible.
       if (!resolver(ours.name, theirs.name)) {
         return false
       }
       // Otherwise, check that the dataTypes are compatible.
       typeCompatible(ours.dataType, theirs.dataType, resolver)
     }
     
     private def typeCompatible(
       ourType: DataType,
       theirType: DataType,
       resolver: Resolver
     ): Boolean = {
       (ourType, theirType) match {
         // If dataTypes are the same, then these columns are compatible
         case (a, b) if a == b => true
         // If both are StructTypes, recurse and ensure that each of
         // the common fields is recursively compatible.
         case (ourSchema: StructType, theirSchema: StructType) =>
           // Every field of theirSchema that also exists in ourSchema must be
           // compatible with its counterpart; fields that only exist in
           // theirSchema are the ones being added, so they are skipped here.
           theirSchema.forall { theirField =>
             findColumnByName(ourSchema, theirField.name, resolver) match {
               case Some(ourField) => columnsCompatible(ourField, theirField, resolver)
               case None => true
             }
           }
         case (ourArray: ArrayType, theirArray: ArrayType) =>
           // If both are ArrayTypes, recurse into their element types.
           typeCompatible(ourArray.elementType, theirArray.elementType, resolver)
         case _ => false
       }
     }
   ```
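
   For reference, here is a minimal sketch of the case that motivated the extra ArrayType branch. It assumes the columnsCompatible/typeCompatible helpers above (and the PR's findColumnByName) are in scope; in the PR they are private, so this is illustrative only. It uses Spark's built-in caseInsensitiveResolution as the resolver, and the column and field names are made up.

   ```scala
   import org.apache.spark.sql.catalyst.analysis.caseInsensitiveResolution
   import org.apache.spark.sql.types._

   // Existing table column: an array whose element struct has a single field.
   val ours = StructField("events", ArrayType(StructType(Seq(
     StructField("id", LongType)))))

   // CHANGE COLUMN target: the same array, but the element struct gains a field.
   val theirs = StructField("events", ArrayType(StructType(Seq(
     StructField("id", LongType),
     StructField("ts", TimestampType)))))

   // Without the ArrayType case, the match falls through to `case _ => false`,
   // because ArrayType equality compares the element structs exactly.
   // With the recursive typeCompatible above, only the common element field
   // "id" has to match, so adding "ts" inside the array element is accepted.
   assert(columnsCompatible(ours, theirs, caseInsensitiveResolution))
   ```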
   
   I hope you can consider this case and strengthen the PR if needed. Thanks again.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services
