Posted to issues@spark.apache.org by "Jianbang Xian (Jira)" <ji...@apache.org> on 2022/08/31 08:32:00 UTC
[jira] [Created] (SPARK-40289) The result is strange when casting string to date in ORC reading via Schema Evolution
Jianbang Xian created SPARK-40289:
-------------------------------------
Summary: The result is strange when casting string to date in ORC reading via Schema Evolution
Key: SPARK-40289
URL: https://issues.apache.org/jira/browse/SPARK-40289
Project: Spark
Issue Type: Bug
Components: Spark Shell
Affects Versions: 3.1.1
Environment: * Ubuntu 18.04 LTS
* Spark 3.1.1
Reporter: Jianbang Xian
I created an ORC file with the following code.
{code:java}
val data = Seq(
  ("", "2022-01-32"), // invalid date; read back as null
  ("", "9808-02-30"), // invalid date; read back as 9808-02-29
  ("", "2022-06-31")  // invalid date; read back as 2022-06-30
)
val cols = Seq("str", "date_str")
val df = spark.createDataFrame(data).toDF(cols: _*).repartition(1)
df.printSchema()
df.show(100)
df.write.mode("overwrite").orc("/tmp/orc/data.orc")
{code}
Note that all three date strings are invalid dates.
Then I read the file back with:
{code:java}
scala> var df = spark.read.schema("date_str date").orc("/tmp/orc/data.orc"); df.show()
+----------+
| date_str|
+----------+
| null|
|9808-02-29|
|2022-06-30|
+----------+{code}
Why is `2022-01-32` converted to `null`, while `9808-02-30` is converted to `9808-02-29` and `2022-06-31` to `2022-06-30`?
Intuitively, all three are invalid dates, so all three should be read as null. Is this a bug or a feature?
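For what it's worth, the observed results match the behavior of java.time's SMART resolver style (this is only a guess at the mechanism; I have not confirmed it against the ORC reader source): a day-of-month beyond the field's valid range of 1..31 fails to parse, while a day that is valid for the field but too large for the given month is clamped to that month's last day. A minimal JDK-only sketch:
{code:java}
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.time.format.ResolverStyle;

public class SmartDateResolution {
    // Hypothetical reproduction of the observed clamping, not the actual ORC code path.
    static String parseOrNull(String s, DateTimeFormatter fmt) {
        try {
            return LocalDate.parse(s, fmt).toString();
        } catch (DateTimeParseException e) {
            return null; // value outside the field's valid range fails entirely
        }
    }

    public static void main(String[] args) {
        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("uuuu-MM-dd")
                .withResolverStyle(ResolverStyle.SMART);
        System.out.println(parseOrNull("2022-01-32", fmt)); // null: 32 exceeds the DAY_OF_MONTH range (1..31)
        System.out.println(parseOrNull("9808-02-30", fmt)); // 9808-02-29: clamped, since 9808 is a leap year
        System.out.println(parseOrNull("2022-06-31", fmt)); // 2022-06-30: clamped to June's last day
    }
}
{code}
This reproduces exactly the null / 9808-02-29 / 2022-06-30 results above, which is why I suspect a SMART-style resolution somewhere in the conversion path.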
*Background*
* I am working on the project [https://github.com/NVIDIA/spark-rapids].
* I am implementing a feature to read ORC files into cuDF (CUDA DataFrame), the in-memory GPU data format of RAPIDS.
* I need to match the CPU ORC-reading behavior; otherwise spark-rapids users would be surprised by differing results.
* Therefore I want to understand why this happens.
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org