Posted to issues@spark.apache.org by "Steven Bakhtiari (JIRA)" <ji...@apache.org> on 2018/09/27 10:14:00 UTC
[jira] [Comment Edited] (SPARK-19950) nullable ignored when df.load() is executed for file-based data source
[ https://issues.apache.org/jira/browse/SPARK-19950?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16630113#comment-16630113 ]
Steven Bakhtiari edited comment on SPARK-19950 at 9/27/18 10:13 AM:
--------------------------------------------------------------------
Is there any movement on this issue (or a rationale for making the schema nullable)?
I raised a duplicate issue, SPARK-25545, with an example of loading a CSV file.
was (Author: stevebakh):
Is there any movement on this issue (or a rationale for making the schema nullable)?
I raised a duplicate issue, SPARK-25545.
> nullable ignored when df.load() is executed for file-based data source
> ----------------------------------------------------------------------
>
> Key: SPARK-19950
> URL: https://issues.apache.org/jira/browse/SPARK-19950
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.2.0
> Reporter: Kazuaki Ishizaki
> Priority: Major
>
> This problem is reported in [Databricks forum|https://forums.databricks.com/questions/7123/nullable-seemingly-ignored-when-reading-parquet.html].
> When we execute the following code, a schema for "id" in {{dfRead}} has {{nullable = true}}. It should be {{nullable = false}}.
> {code:java}
> import org.apache.spark.sql.types.{LongType, StructField, StructType}
>
> val field = "id"
> val df = spark.range(0, 5, 1, 1).toDF(field)
> val fmt = "parquet"
> val path = "/tmp/parquet"
> val schema = StructType(Seq(StructField(field, LongType, nullable = false)))
> df.write.format(fmt).mode("overwrite").save(path)
> val dfRead = spark.read.format(fmt).schema(schema).load(path)
> dfRead.printSchema
> {code}
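One possible workaround, sketched below (not part of the original report; the variable names and paths are illustrative and it assumes an active SparkSession named {{spark}}), is to rebuild the loaded DataFrame against the intended schema, since {{createDataFrame}} applies the supplied schema, including its nullability flags, as-is:

```scala
import org.apache.spark.sql.types.{LongType, StructField, StructType}

// The schema we actually want, with id marked non-nullable.
val schema = StructType(Seq(StructField("id", LongType, nullable = false)))

// For file-based sources, spark.read.schema(...) forces nullable = true,
// so re-wrap the rows with the strict schema after loading.
val loaded = spark.read.format("parquet").schema(schema).load("/tmp/parquet")
val strict = spark.createDataFrame(loaded.rdd, schema)

strict.printSchema  // id now reports nullable = false
```

Note that this bypasses the data source's own nullability handling, so it is only safe when the caller can guarantee the column genuinely contains no nulls.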
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org