Posted to reviews@spark.apache.org by gengliangwang <gi...@git.apache.org> on 2018/06/01 17:35:38 UTC

[GitHub] spark pull request #21439: [SPARK-24391][SQL] Support arrays of any types by...

Github user gengliangwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21439#discussion_r192465130
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/jsonExpressions.scala ---
    @@ -548,7 +553,9 @@ case class JsonToStructs(
           forceNullableSchema = SQLConf.get.getConf(SQLConf.FROM_JSON_FORCE_NULLABLE_SCHEMA))
     
       override def checkInputDataTypes(): TypeCheckResult = nullableSchema match {
    -    case _: StructType | ArrayType(_: StructType, _) | _: MapType =>
    +    case ArrayType(_: StructType, _) if unpackArray =>
    --- End diff --
    
    Even if `unpackArray` is `false`, the next branch at line 558 still does `super.checkInputDataTypes()` for any `ArrayType`.
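    
    Below is a minimal standalone sketch of the match fallthrough being described. It is not the actual Spark source: the type names are simplified stand-ins for Spark's Catalyst types, and the second branch's string result stands in for the delegation to `super.checkInputDataTypes()` mentioned above.
    
        // Simplified stand-ins for Spark's Catalyst types (hypothetical).
        sealed trait DataType
        case class StructType() extends DataType
        case class MapType() extends DataType
        case class ArrayType(elementType: DataType, containsNull: Boolean) extends DataType
        case object StringType extends DataType
    
        object CheckSketch {
          def check(nullableSchema: DataType, unpackArray: Boolean): String =
            nullableSchema match {
              // Guarded branch from the diff: only taken when unpackArray is true.
              case ArrayType(_: StructType, _) if unpackArray =>
                "guarded branch: array of structs, unpacking requested"
              // "Next branch": matches any ArrayType (among others), standing in
              // for the super.checkInputDataTypes() call the comment refers to.
              case _: StructType | _: ArrayType | _: MapType =>
                "next branch: generic input-type check"
              case other =>
                s"unsupported input type: $other"
            }
    
          def main(args: Array[String]): Unit = {
            val arrayOfStructs = ArrayType(StructType(), containsNull = true)
            // With unpackArray = false the guard fails, so an array of structs
            // still reaches the second branch -- the point of the review comment.
            println(check(arrayOfStructs, unpackArray = false))
            println(check(arrayOfStructs, unpackArray = true))
            println(check(StringType, unpackArray = false))
          }
        }
    
    Running the sketch prints the second branch's message for `unpackArray = false`, illustrating that the guard narrows which branch handles an array of structs but does not reject it.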

