Posted to user@spark.apache.org by Masf <ma...@gmail.com> on 2015/03/27 10:45:51 UTC

Error in Delete Table

Hi.

In HiveContext, when I run the statement "DROP TABLE IF EXISTS TestTable"
and TestTable doesn't exist, Spark returns an error:



ERROR Hive: NoSuchObjectException(message:default.TestTable table not found)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1008)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:90)
	at com.sun.proxy.$Proxy22.getTable(Unknown Source)
	at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1000)
	at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:942)
	at org.apache.hadoop.hive.ql.exec.DDLTask.dropTableOrPartitions(DDLTask.java:3887)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:310)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1554)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1321)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1139)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:962)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:952)
	at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
	at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:276)
	at org.apache.spark.sql.hive.execution.DropTable.sideEffectResult$lzycompute(commands.scala:58)
	at org.apache.spark.sql.hive.execution.DropTable.sideEffectResult(commands.scala:56)
	at org.apache.spark.sql.execution.Command$class.execute(commands.scala:46)
	at org.apache.spark.sql.hive.execution.DropTable.execute(commands.scala:51)
	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:425)
	at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:425)
	at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
	at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:108)
	at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:94)
	at GeoMain$$anonfun$HiveExecution$1.apply(GeoMain.scala:98)
	at GeoMain$$anonfun$HiveExecution$1.apply(GeoMain.scala:98)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
	at GeoMain$.HiveExecution(GeoMain.scala:96)
	at GeoMain$.main(GeoMain.scala:17)
	at GeoMain.main(GeoMain.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
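
For reference, the call path in the trace comes from code roughly like
this (a minimal sketch; the object name and table name are illustrative
stand-ins for my actual GeoMain.scala):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object DropTableRepro {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("DropTableRepro"))
    val hiveContext = new HiveContext(sc)

    // IF EXISTS should make this a no-op when the table is missing,
    // yet the metastore lookup above is still logged at ERROR level.
    hiveContext.sql("DROP TABLE IF EXISTS TestTable")

    sc.stop()
  }
}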


Thanks!!
-- 


Regards.
Miguel Ángel

Re: Error in Delete Table

Posted by Masf <ma...@gmail.com>.
Hi Ted.

Spark 1.2.0 and Hive 0.13.1

Regards.
Miguel Angel.
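
In the meantime, a possible workaround (an untested sketch against the
Spark 1.2 API; it assumes SHOW TABLES returns one table name per row and
that Hive stores table names lower-cased) is to check the metastore
before issuing the DROP:

import org.apache.spark.sql.hive.HiveContext

// Drop the table only if the metastore lists it, so Hive never has to
// log the NoSuchObjectException. hiveContext is an existing HiveContext;
// tableName is an unqualified name such as "TestTable".
def dropIfExists(hiveContext: HiveContext, tableName: String): Unit = {
  val tables = hiveContext.sql("SHOW TABLES").collect().map(_.getString(0))
  if (tables.contains(tableName.toLowerCase)) {
    hiveContext.sql(s"DROP TABLE $tableName")
  }
}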


On Tue, Mar 31, 2015 at 10:37 AM, Ted Yu <yu...@gmail.com> wrote:

> Which Spark and Hive release are you using ?
>
> Thanks
>
>
>
> > On Mar 27, 2015, at 2:45 AM, Masf <ma...@gmail.com> wrote:
> >
> > Hi.
> >
> > In HiveContext, when I run the statement "DROP TABLE IF EXISTS TestTable"
> > and TestTable doesn't exist, Spark returns an error:
> >
> >
> > ERROR Hive: NoSuchObjectException(message:default.TestTable table not found)
> > [... stack trace snipped; same trace as above ...]
> >
> >
> > Thanks!!
> > --
> >
> >
> > Regards.
> > Miguel Ángel
>



-- 


Regards.
Miguel Ángel

Re: Error in Delete Table

Posted by Ted Yu <yu...@gmail.com>.
Which Spark and Hive release are you using ?

Thanks



> On Mar 27, 2015, at 2:45 AM, Masf <ma...@gmail.com> wrote:
> 
> Hi.
> 
> In HiveContext, when I run the statement "DROP TABLE IF EXISTS TestTable"
> and TestTable doesn't exist, Spark returns an error:
> 
> 
> 
> ERROR Hive: NoSuchObjectException(message:default.TestTable table not found)
> [... stack trace snipped; same trace as above ...]
> 
> 
> Thanks!!
> -- 
> 
> 
> Regards.
> Miguel Ángel
