Posted to issues@spark.apache.org by "Prasanna Saraswathi Krishnan (Jira)" <ji...@apache.org> on 2019/10/11 17:49:00 UTC
[jira] [Reopened] (SPARK-29432) nullable flag of new column changes when persisting a pyspark dataframe
[ https://issues.apache.org/jira/browse/SPARK-29432?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Prasanna Saraswathi Krishnan reopened SPARK-29432:
--------------------------------------------------
Edited the commands to reproduce the issue. The issue is not resolved.
> nullable flag of new column changes when persisting a pyspark dataframe
> -----------------------------------------------------------------------
>
> Key: SPARK-29432
> URL: https://issues.apache.org/jira/browse/SPARK-29432
> Project: Spark
> Issue Type: Question
> Components: SQL
> Affects Versions: 2.4.0
> Environment: Spark 2.4.0-cdh6.1.1 (Cloudera distribution)
> Python 3.7.3
> Reporter: Prasanna Saraswathi Krishnan
> Priority: Minor
>
> When I add a new column to a dataframe with the {{withColumn}} function, the column is added with {{nullable=false}} by default.
> But when I save the dataframe, the flag changes to {{nullable=true}}. Is this the expected behavior? Why?
>
> {code:java}
> >>> l = [('Alice', 1)]
> >>> df = spark.createDataFrame(l)
> >>> df.printSchema()
> root
> |-- _1: string (nullable = true)
> |-- _2: long (nullable = true)
> {code}
> {code:java}
> >>> from pyspark.sql.functions import lit
> >>> df = df.withColumn('newCol', lit('newVal'))
> >>> df.printSchema()
> root
> |-- _1: string (nullable = true)
> |-- _2: long (nullable = true)
> |-- newCol: string (nullable = false)
> >>> df.write.saveAsTable('default.withcolTest', mode='overwrite')
> >>> spark.sql("select * from default.withcolTest").printSchema()
> root
> |-- _1: string (nullable = true)
> |-- _2: long (nullable = true)
> |-- newCol: string (nullable = true)
> {code}
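> The same behaviour can also be checked programmatically instead of reading {{printSchema}} output by eye. The sketch below assumes the {{default.withcolTest}} table created above; the explicit-schema workaround at the end uses hypothetical variable names ({{strict}}, {{restored}}) that are not part of the original commands.
> {code:java}
> >>> # StructType supports lookup by field name, so the flag can be read directly.
> >>> df.schema['newCol'].nullable
> False
> >>> # After the round trip through the saved table, the same column reports nullable.
> >>> spark.table('default.withcolTest').schema['newCol'].nullable
> True
> >>> # One possible workaround: re-apply an explicit schema when reading back.
> >>> from pyspark.sql.types import StructType, StructField, StringType, LongType
> >>> strict = StructType([
> ...     StructField('_1', StringType(), True),
> ...     StructField('_2', LongType(), True),
> ...     StructField('newCol', StringType(), False)])
> >>> restored = spark.createDataFrame(spark.table('default.withcolTest').rdd, strict)
> >>> restored.schema['newCol'].nullable
> False
> {code}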