Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/01/25 09:46:00 UTC
[jira] [Commented] (SPARK-27612) Creating a DataFrame in PySpark
with ArrayType produces some Rows with Arrays of None
[ https://issues.apache.org/jira/browse/SPARK-27612?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17023481#comment-17023481 ]
Dongjoon Hyun commented on SPARK-27612:
---------------------------------------
I also double-checked that this is still not required in branch-2.4.
To distinguish this from the other correctness issue, I set `Target Version` to `3.0.0`.
> Creating a DataFrame in PySpark with ArrayType produces some Rows with Arrays of None
> -------------------------------------------------------------------------------------
>
> Key: SPARK-27612
> URL: https://issues.apache.org/jira/browse/SPARK-27612
> Project: Spark
> Issue Type: Bug
> Components: PySpark, SQL
> Affects Versions: 3.0.0
> Reporter: Bryan Cutler
> Assignee: Hyukjin Kwon
> Priority: Blocker
> Labels: correctness
> Fix For: 3.0.0
>
>
> This seems to only affect Python 3.
> When creating a DataFrame with type {{ArrayType(IntegerType(), True)}}, some rows end up filled with None.
>
> {code:java}
> In [1]: from pyspark.sql.types import ArrayType, IntegerType
> In [2]: df = spark.createDataFrame([[1, 2, 3, 4]] * 100, ArrayType(IntegerType(), True))
> In [3]: df.distinct().collect()
> Out[3]: [Row(value=[None, None, None, None]), Row(value=[1, 2, 3, 4])]
> {code}
>
> In this example, the corrupted rows consistently appear at indices 97 and 98:
> {code}
> In [5]: df.collect()[-5:]
> Out[5]:
> [Row(value=[1, 2, 3, 4]),
> Row(value=[1, 2, 3, 4]),
> Row(value=[None, None, None, None]),
> Row(value=[None, None, None, None]),
> Row(value=[1, 2, 3, 4])]
> {code}
> This also happens with the type {{ArrayType(ArrayType(IntegerType(), True))}}.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)