Posted to issues@spark.apache.org by "Zhongshuai Pei (JIRA)" <ji...@apache.org> on 2015/04/17 11:17:58 UTC
[jira] [Updated] (SPARK-6976) "drop table if exists src" print
ERROR info that should not be printed when "src" not exists.
[ https://issues.apache.org/jira/browse/SPARK-6976?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Zhongshuai Pei updated SPARK-6976:
----------------------------------
Summary: "drop table if exists src" print ERROR info that should not be printed when "src" not exists. (was: "drop table if exists src" print ERROR info that should not be printed)
> "drop table if exists src" print ERROR info that should not be printed when "src" not exists.
> ---------------------------------------------------------------------------------------------
>
> Key: SPARK-6976
> URL: https://issues.apache.org/jira/browse/SPARK-6976
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.3.0
> Reporter: Zhongshuai Pei
>
> If the table "src" does not exist and the SQL statement "drop table if exists src" is run, ERROR info is printed even though the statement succeeds, like this:
> {quote}
> 15/04/17 17:09:53 ERROR Hive: NoSuchObjectException(message:default.src table not found)
> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1560)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:601)
> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
> at $Proxy10.get_table(Unknown Source)
> {quote}
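The complaint is that the metastore's NoSuchObjectException gets logged at ERROR before the IF EXISTS semantics can turn the missing table into a silent no-op. The expected behavior can be sketched generically; this is a minimal Python stand-in with hypothetical names, not Spark's actual code path:

```python
class NoSuchObjectException(Exception):
    """Stand-in for Hive's metastore NoSuchObjectException."""


# Toy catalog standing in for the Hive metastore's table registry.
catalog = {"other_table": object()}


def drop_table(name, if_exists=False):
    """Drop a table; with if_exists=True a missing table is a quiet no-op."""
    if name not in catalog:
        if if_exists:
            # Expected behavior: return without logging anything at ERROR.
            return False
        raise NoSuchObjectException("default.%s table not found" % name)
    del catalog[name]
    return True


# "src" does not exist: with IF EXISTS this must succeed quietly.
print(drop_table("src", if_exists=True))  # → False
```

The bug report implies the check happens in the wrong order: the metastore lookup raises (and logs) before the IF EXISTS flag is consulted, so the exception should instead be caught or the existence check done first.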
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org