Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2020/07/28 07:31:00 UTC

[jira] [Resolved] (SPARK-32176) Automatic type promotion to ArrayType in defined schema in from_json is broken

     [ https://issues.apache.org/jira/browse/SPARK-32176?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-32176.
----------------------------------
    Resolution: Cannot Reproduce

> Automatic type promotion to ArrayType in defined schema in from_json is broken
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-32176
>                 URL: https://issues.apache.org/jira/browse/SPARK-32176
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Abhishek Adhikari
>            Priority: Major
>
>  
> In Spark 2.4 I am able to read data where the "stats" column holds mixed JSON shapes (an array of objects in some rows, a single object in others) by defining the column as StringType and parsing the inner data later:
>  
>  from pyspark.sql import functions as f
>  from pyspark.sql.types import StructType, IntegerType, ArrayType
>  
>  # Schema of one element of the "stats" array
>  stats_def = StructType().add("hour", IntegerType(), True).add("hits", IntegerType(), True)
>  df2 = df.select(f.col("stats"), f.from_json(f.col("stats"), ArrayType(stats_def)).alias("stats_array"))
>  df2.show(5, False)
>  df2.printSchema()
>   
> ||stats||stats_array||
> |[\{"hour":3,"hits":1},\{"hour":4,"hits":1}]|[[3, 1], [4, 1]]|
> |{"hits":20}|[[, 20]]|
> root
>  |-- stats: string (nullable = true)
>  |-- stats_array: array (nullable = true)
>  |    |-- element: struct (containsNull = true)
>  |    |    |-- hour: integer (nullable = true)
>  |    |    |-- hits: integer (nullable = true)
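>  
> For reference, a minimal sketch of how the input DataFrame could be built to reproduce this (the sample rows come from the table above; the construction itself is my assumption, not the original pipeline):
>  
>  # Hypothetical repro: a single string column "stats" holding either
>  # a JSON array of objects or a bare JSON object per row.
>  from pyspark.sql import SparkSession
>  
>  spark = SparkSession.builder.getOrCreate()
>  df = spark.createDataFrame(
>      [('[{"hour":3,"hits":1},{"hour":4,"hits":1}]',),
>       ('{"hits":20}',)],
>      ["stats"],
>  )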
>   
>  In Spark 3.0.0 the same code throws an error:
>  java.lang.ClassCastException: java.lang.Integer cannot be cast to org.apache.spark.sql.catalyst.util.ArrayData
>   
>  I think this is an important feature and should still be supported, perhaps via a from_json option.
>   
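> In the meantime, one possible workaround is to normalize bare JSON objects into one-element arrays before parsing. This is a sketch of my own (the when/concat normalization below is not an official from_json option):
>  
>  # Sketch of a workaround (not an official from_json option):
>  # wrap bare JSON objects in brackets so every row is a JSON array,
>  # then parse with the same ArrayType schema as in 2.4.
>  normalized = f.when(f.trim(f.col("stats")).startswith("["), f.col("stats")) \
>                 .otherwise(f.concat(f.lit("["), f.col("stats"), f.lit("]")))
>  df3 = df.select(f.col("stats"),
>                  f.from_json(normalized, ArrayType(stats_def)).alias("stats_array"))
>  df3.show(5, False)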



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org