Posted to issues@spark.apache.org by "yikaifei (Jira)" <ji...@apache.org> on 2023/04/07 07:46:00 UTC
[jira] [Created] (SPARK-43063) `df.show` handle null should print NULL instead of null
yikaifei created SPARK-43063:
--------------------------------
Summary: `df.show` handle null should print NULL instead of null
Key: SPARK-43063
URL: https://issues.apache.org/jira/browse/SPARK-43063
Project: Spark
Issue Type: Improvement
Components: SQL
Affects Versions: 3.4.0
Reporter: yikaifei
`df.show` should print NULL instead of null when handling null values, for consistent behavior.
For example, the following behavior is currently inconsistent:
``` shell
scala> spark.sql("select decode(6, 1, 'Southlake', 2, 'San Francisco', 3, 'New Jersey', 4, 'Seattle') as result").show(false)
+------+
|result|
+------+
|null  |
+------+
```
``` shell
spark-sql> DESC FUNCTION EXTENDED decode;
function_desc
Function: decode
Class: org.apache.spark.sql.catalyst.expressions.Decode
Usage:
decode(bin, charset) - Decodes the first argument using the second argument character set.
decode(expr, search, result [, search, result ] ... [, default]) - Compares expr
to each search value in order. If expr is equal to a search value, decode returns
the corresponding result. If no match is found, then it returns default. If default
is omitted, it returns null.
Extended Usage:
Examples:
> SELECT decode(encode('abc', 'utf-8'), 'utf-8');
abc
> SELECT decode(2, 1, 'Southlake', 2, 'San Francisco', 3, 'New Jersey', 4, 'Seattle', 'Non domestic');
San Francisco
> SELECT decode(6, 1, 'Southlake', 2, 'San Francisco', 3, 'New Jersey', 4, 'Seattle', 'Non domestic');
Non domestic
> SELECT decode(6, 1, 'Southlake', 2, 'San Francisco', 3, 'New Jersey', 4, 'Seattle');
NULL
Since: 3.2.0
Time taken: 0.074 seconds, Fetched 4 row(s)
```
``` shell
spark-sql> select decode(6, 1, 'Southlake', 2, 'San Francisco', 3, 'New Jersey', 4, 'Seattle');
NULL
```
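A minimal sketch of the formatting rule this ticket asks for, in plain Scala. The object and method names (`ShowCellFormat`, `formatCell`) are hypothetical; in Spark the actual rendering logic lives inside `Dataset`'s show/string-formatting code, and this only illustrates the proposed change: render SQL NULL as the uppercase literal `NULL` (matching the spark-sql CLI) instead of lowercase `null`.

``` scala
// Hypothetical sketch (not Spark's actual code): format one cell of
// df.show output, mapping a null value to "NULL" as this issue proposes.
object ShowCellFormat {
  def formatCell(cell: Any): String = cell match {
    case null => "NULL" // proposed: match spark-sql CLI output, not "null"
    case binary: Array[Byte] =>
      // render binary values as hex bytes, e.g. [61 62 63]
      binary.map("%02X".format(_)).mkString("[", " ", "]")
    case other => other.toString
  }

  def main(args: Array[String]): Unit = {
    println(formatCell(null))            // NULL
    println(formatCell("San Francisco")) // San Francisco
  }
}
```

With this rule, the `df.show(false)` example above would print `NULL` in the result column, consistent with the spark-sql output shown.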
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org