Posted to issues@spark.apache.org by "Todd Nemet (JIRA)" <ji...@apache.org> on 2016/10/07 00:47:20 UTC

[jira] [Created] (SPARK-17818) Cannot SELECT NULL

Todd Nemet created SPARK-17818:
----------------------------------

             Summary: Cannot SELECT NULL
                 Key: SPARK-17818
                 URL: https://issues.apache.org/jira/browse/SPARK-17818
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.0.1, 2.0.0
            Reporter: Todd Nemet


When connecting to Spark SQL via JDBC/the Thrift server, running `SELECT NULL` (casting NULL to NULL) returns this error:

{code}
0: jdbc:hive2://localhost:10016/> select cast(NULL as NULL);
Error: org.apache.spark.sql.catalyst.parser.ParseException: 
DataType null() is not supported.(line 1, pos 20)
{code}

This is a regression from Spark 1.x/HiveQL compatibility. In Spark 1.5.2, the same query returns NULL:

{code}
0: jdbc:hive2://localhost:10015/> select null;
+-------+--+
|  _c0  |
+-------+--+
| NULL  |
+-------+--+
1 row selected (0.658 seconds)
{code}

I can select expressions that evaluate to NULL or CAST NULL to different types:
{code}
0: jdbc:hive2://localhost:10016/> select null=null;
+----------------+--+
| (NULL = NULL)  |
+----------------+--+
| NULL           |
+----------------+--+
1 row selected (0.69 seconds)
0: jdbc:hive2://localhost:10016/> select cast (NULL as int);
+--------------------+--+
| CAST(NULL AS INT)  |
+--------------------+--+
| NULL               |
+--------------------+--+
1 row selected (0.676 seconds)
0: jdbc:hive2://localhost:10016/> select cast (NULL as string);
+-----------------------+--+
| CAST(NULL AS STRING)  |
+-----------------------+--+
| NULL                  |
+-----------------------+--+
{code}
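The transcripts above suggest the usual client-side workaround: give NULL an explicit type so the result schema carries a concrete column type rather than NullType. A minimal sketch of that pattern through a DB-API client, using Python's sqlite3 in-memory database purely as a stand-in backend (the affected server here is the Spark Thrift server, which a real client would reach via a Hive JDBC/ODBC driver instead):

```python
import sqlite3

# In-memory database as a stand-in backend; the actual bug lives in the
# Spark Thrift server, which rejects the untyped NULL of a bare SELECT NULL.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Workaround pattern from the report: CAST NULL to a concrete type so the
# result column has a usable type. The value still comes back as SQL NULL
# (Python None).
cur.execute("SELECT CAST(NULL AS INTEGER)")
print(cur.fetchone())  # (None,)

cur.execute("SELECT CAST(NULL AS TEXT)")
print(cur.fetchone())  # (None,)

conn.close()
```

The same rewrite (`SELECT CAST(NULL AS INT)` or `... AS STRING` in place of `SELECT NULL`) is what the beeline sessions above show working against Spark 2.0.x.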




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org