Posted to issues@spark.apache.org by "Todd Nemet (JIRA)" <ji...@apache.org> on 2016/10/10 20:20:20 UTC

[jira] [Created] (SPARK-17857) SHOW TABLES IN schema throws exception if schema doesn't exist

Todd Nemet created SPARK-17857:
----------------------------------

             Summary: SHOW TABLES IN schema throws exception if schema doesn't exist
                 Key: SPARK-17857
                 URL: https://issues.apache.org/jira/browse/SPARK-17857
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.0.1, 2.0.0
            Reporter: Todd Nemet
            Priority: Minor


Running SHOW TABLES IN badschema; throws org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException when badschema doesn't exist. In Spark 1.x the same statement returned an empty result set.
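
The same difference shows up outside of beeline, directly in spark-shell. A minimal sketch, assuming a Hive-enabled session bound to the usual shell names (spark in 2.x, sqlContext in 1.6), one line per version:

{code}
// Spark 2.0.1 spark-shell: throws NoSuchDatabaseException
spark.sql("SHOW TABLES IN badschema").show()

// Spark 1.6.2 spark-shell: returns an empty tableName/isTemporary result
sqlContext.sql("SHOW TABLES IN badschema").show()
{code}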

On Spark 2.0.1:

{code}
[683|12:45:56] ~/Documents/spark/spark$ bin/beeline -u jdbc:hive2://localhost:10006/ -n hive
Connecting to jdbc:hive2://localhost:10006/
16/10/10 12:46:00 INFO jdbc.Utils: Supplied authorities: localhost:10006
16/10/10 12:46:00 INFO jdbc.Utils: Resolved authority: localhost:10006
16/10/10 12:46:00 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://localhost:10006/
Connected to: Spark SQL (version 2.0.1)
Driver: Hive JDBC (version 1.2.1.spark2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.2.1.spark2 by Apache Hive
0: jdbc:hive2://localhost:10006/> show schemas;
+-----------------------+--+
|     databaseName      |
+-----------------------+--+
| default               |
| looker_scratch        |
| spark_jira            |
| spark_looker_scratch  |
| spark_looker_test     |
+-----------------------+--+
5 rows selected (0.61 seconds)
0: jdbc:hive2://localhost:10006/> show tables in spark_looker_test;
+--------------+--------------+--+
|  tableName   | isTemporary  |
+--------------+--------------+--+
| all_types    | false        |
| order_items  | false        |
| orders       | false        |
| users        | false        |
+--------------+--------------+--+
4 rows selected (0.611 seconds)
0: jdbc:hive2://localhost:10006/> show tables in badschema;
Error: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'badschema' not found; (state=,code=0)
{code}

On Spark 1.6.2:

{code}
[680|12:47:26] ~/Documents/spark/spark$ bin/beeline -u jdbc:hive2://localhost:10005/ -n hive
Connecting to jdbc:hive2://localhost:10005/
16/10/10 12:47:29 INFO jdbc.Utils: Supplied authorities: localhost:10005
16/10/10 12:47:29 INFO jdbc.Utils: Resolved authority: localhost:10005
16/10/10 12:47:30 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://localhost:10005/
Connected to: Spark SQL (version 1.6.2)
Driver: Hive JDBC (version 1.2.1.spark2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.2.1.spark2 by Apache Hive
0: jdbc:hive2://localhost:10005/> show schemas;
+--------------------+--+
|       result       |
+--------------------+--+
| default            |
| spark_jira         |
| spark_looker_test  |
| spark_scratch      |
+--------------------+--+
4 rows selected (0.613 seconds)
0: jdbc:hive2://localhost:10005/> show tables in spark_looker_test;
+--------------+--------------+--+
|  tableName   | isTemporary  |
+--------------+--------------+--+
| all_types    | false        |
| order_items  | false        |
| orders       | false        |
| users        | false        |
+--------------+--------------+--+
4 rows selected (0.575 seconds)
0: jdbc:hive2://localhost:10005/> show tables in badschema;
+------------+--------------+--+
| tableName  | isTemporary  |
+------------+--------------+--+
+------------+--------------+--+
No rows selected (0.458 seconds)
{code}
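
Until this is resolved, callers can guard the statement themselves and fall back to an empty result, matching the 1.x behavior. A minimal workaround sketch only, not a proposed fix (spark is the usual 2.x shell binding, badschema the nonexistent schema from above):

{code}
import scala.util.Try

// Treat a missing schema as "no tables", mirroring the Spark 1.x result
val tables = Try(spark.sql("SHOW TABLES IN badschema").collect())
  .getOrElse(Array.empty[org.apache.spark.sql.Row])
{code}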

[Relevant part of the HiveQL DDL docs on SHOW TABLES|https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-ShowTables]



