Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2017/07/11 14:05:02 UTC

[jira] [Resolved] (SPARK-21365) Deduplicate logics parsing DDL-like type definition

     [ https://issues.apache.org/jira/browse/SPARK-21365?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-21365.
---------------------------------
       Resolution: Fixed
    Fix Version/s: 2.3.0

Issue resolved by pull request 18590
[https://github.com/apache/spark/pull/18590]

> Deduplicate logics parsing DDL-like type definition
> ---------------------------------------------------
>
>                 Key: SPARK-21365
>                 URL: https://issues.apache.org/jira/browse/SPARK-21365
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.2.0
>            Reporter: Hyukjin Kwon
>             Fix For: 2.3.0
>
>
> It looks like we duplicate the logic at https://github.com/apache/spark/blob/d492cc5a21cd67b3999b85d97f5c41c3734b1ba3/python/pyspark/sql/types.py#L823-L845 for parsing DDL-like type definitions.
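> The duplicated helper boils down to the kind of comma/colon splitting sketched below (a simplified, hypothetical reconstruction; names and details differ from the linked code):
> {code}
> # Hypothetical, simplified reconstruction of the duplicated splitting logic,
> # not the actual code at the link above. It shows why the two limitations
> # below exist: "a int" has no colon, and the inner colon of
> # "a: struct<b: int>" makes the split yield more than two parts.
> def parse_struct_fields(s):
>     fields = []
>     for part in s.split(","):
>         name_and_type = part.split(":")
>         if len(name_and_type) != 2:
>             raise ValueError("The struct field string format is: "
>                              "'field_name:field_type', but got: %s" % part)
>         name, type_str = (x.strip() for x in name_and_type)
>         fields.append((name, type_str))
>     return fields
> {code}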
> There are two more points here:
> - It supports only the "field: type" form, not the "field type" form.
> - It does not support nested schemas.
> Both limitations show up in the examples below (a sketch of a consolidated parser follows them):
> {code}
> >>> spark.createDataFrame([[[1]]], "struct<a: struct<b: int>>").show()
> ...
> ValueError: The strcut field string format is: 'field_name:field_type', but got: a: struct<b: int>
> {code}
> {code}
> >>> spark.createDataFrame([[[1]]], "a: struct<b: int>").show()
> ...
> ValueError: The strcut field string format is: 'field_name:field_type', but got: a: struct<b: int>
> {code}
> {code}
> >>> spark.createDataFrame([[[1]]], "a int").show()
> ...
> ValueError: Could not parse datatype: a int
> {code}
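> A consolidated parser could instead delegate every DDL-like string to the Scala-side parser and fall back between the two accepted forms. A minimal sketch, assuming the JVM helpers StructType.fromDDL and PythonSQLUtils.parseDataType are reachable through the Py4J gateway:
> {code}
> from pyspark import SparkContext
> from pyspark.sql.types import _parse_datatype_json_string
>
> def parse_datatype_string(s):
>     # Route all parsing through the JVM so Python and SQL accept the
>     # same syntax, including nested structs.
>     sc = SparkContext._active_spark_context
>
>     def from_ddl_schema(type_str):
>         # Table-schema form: "field type, field type".
>         return _parse_datatype_json_string(
>             sc._jvm.org.apache.spark.sql.types.StructType.fromDDL(type_str).json())
>
>     def from_ddl_datatype(type_str):
>         # Single data type: "int", "struct<a: struct<b: int>>", ...
>         return _parse_datatype_json_string(
>             sc._jvm.org.apache.spark.sql.api.python.PythonSQLUtils.parseDataType(
>                 type_str).json())
>
>     try:
>         return from_ddl_schema(s)
>     except Exception as e:
>         try:
>             return from_ddl_datatype(s)
>         except Exception:
>             try:
>                 # Backwards compatibility for "field: type, field: type".
>                 return from_ddl_datatype("struct<%s>" % s.strip())
>             except Exception:
>                 raise e
> {code}
> With a single entry point like this, all three strings above would be handled by the nested-struct-capable SQL parser.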



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org