Posted to commits@spark.apache.org by we...@apache.org on 2021/08/06 04:53:40 UTC
[spark] branch master updated: [SPARK-36429][SQL] JacksonParser should throw exception when data type unsupported
This is an automated email from the ASF dual-hosted git repository.
wenchen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new eb12727 [SPARK-36429][SQL] JacksonParser should throw exception when data type unsupported
eb12727 is described below
commit eb12727bc72235b21badd20854564cd0128e969e
Author: gengjiaan <ge...@360.cn>
AuthorDate: Fri Aug 6 12:53:04 2021 +0800
[SPARK-36429][SQL] JacksonParser should throw exception when data type unsupported
### What changes were proposed in this pull request?
Currently, when `spark.sql.timestampType=TIMESTAMP_NTZ` is set, the behavior differs between `from_json` and `from_csv`.
```
-- !query
select from_json('{"t":"26/October/2015"}', 't Timestamp', map('timestampFormat', 'dd/MMMMM/yyyy'))
-- !query schema
struct<from_json({"t":"26/October/2015"}):struct<t:timestamp_ntz>>
-- !query output
{"t":null}
```
```
-- !query
select from_csv('26/October/2015', 't Timestamp', map('timestampFormat', 'dd/MMMMM/yyyy'))
-- !query schema
struct<>
-- !query output
java.lang.Exception
Unsupported type: timestamp_ntz
```
We should make `from_json` throw an exception too.
This PR fixes the issue discussed at
https://github.com/apache/spark/pull/33640#discussion_r682862523
### Why are the changes needed?
Make the behavior of `from_json` more reasonable.
### Does this PR introduce _any_ user-facing change?
Yes.
`from_json` now throws an exception when `spark.sql.timestampType=TIMESTAMP_NTZ` is set.
### How was this patch tested?
Tests updated.
Closes #33654 from beliefer/SPARK-36429.
Authored-by: gengjiaan <ge...@360.cn>
Signed-off-by: Wenchen Fan <we...@databricks.com>
---
.../scala/org/apache/spark/sql/catalyst/json/JacksonParser.scala | 8 ++------
.../resources/sql-tests/results/timestampNTZ/timestamp.sql.out | 5 +++--
2 files changed, 5 insertions(+), 8 deletions(-)
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JacksonParser.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JacksonParser.scala
index dfa746f..fe1fa87 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JacksonParser.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/json/JacksonParser.scala
@@ -330,12 +330,8 @@ class JacksonParser(
case udt: UserDefinedType[_] =>
makeConverter(udt.sqlType)
- case _ =>
- (parser: JsonParser) =>
- // Here, we pass empty `PartialFunction` so that this case can be
- // handled as a failed conversion. It will throw an exception as
- // long as the value is not null.
- parseJsonToken[AnyRef](parser, dataType)(PartialFunction.empty[JsonToken, AnyRef])
+ // We don't actually hit this exception though, we keep it for understandability
+ case _ => throw QueryExecutionErrors.unsupportedTypeError(dataType)
}
/**
diff --git a/sql/core/src/test/resources/sql-tests/results/timestampNTZ/timestamp.sql.out b/sql/core/src/test/resources/sql-tests/results/timestampNTZ/timestamp.sql.out
index b8a6800..c6de535 100644
--- a/sql/core/src/test/resources/sql-tests/results/timestampNTZ/timestamp.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/timestampNTZ/timestamp.sql.out
@@ -642,9 +642,10 @@ You may get a different result due to the upgrading of Spark 3.0: Fail to recogn
-- !query
select from_json('{"t":"26/October/2015"}', 't Timestamp', map('timestampFormat', 'dd/MMMMM/yyyy'))
-- !query schema
-struct<from_json({"t":"26/October/2015"}):struct<t:timestamp_ntz>>
+struct<>
-- !query output
-{"t":null}
+java.lang.Exception
+Unsupported type: timestamp_ntz
-- !query
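The effect of the JacksonParser change above can be sketched with a simplified, self-contained model (the names below are illustrative, not Spark's real API; the actual code lives in `JacksonParser.makeConverter` and threads a `JsonParser` through `parseJsonToken`):

```scala
// Simplified model of the patch. makeConverter builds a per-type value
// converter; the interesting part is the catch-all case for unsupported types.

sealed trait DataType
case object StringType extends DataType
case object TimestampNTZType extends DataType // stands in for any unsupported type

// Before the patch: the catch-all installed an empty PartialFunction, so a
// null JSON value quietly converted to null and only non-null values threw.
def makeConverterOld(dt: DataType): Any => Any = dt match {
  case StringType => v => String.valueOf(v)
  case _ => {
    case null => null // parseJsonToken lets nulls through unconditionally
    case _    => throw new Exception(s"Unsupported type: $dt")
  }
}

// After the patch: building a converter for an unsupported type fails
// immediately, matching the eager failure from_csv already had.
def makeConverterNew(dt: DataType): Any => Any = dt match {
  case StringType => v => String.valueOf(v)
  case _ => throw new Exception(s"Unsupported type: $dt")
}
```

Under the old scheme a row like `{"t":"26/October/2015"}` that failed to parse became a null value and slipped through as `{"t":null}`; under the new scheme the unsupported type is rejected before any row is read, which is why the test output flips from a null-bearing row to an exception.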
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org