Posted to dev@spark.apache.org by Zhan Zhang <zh...@gmail.com> on 2014/08/12 01:17:17 UTC

Spark testsuite error for hive 0.13.

I am trying to change Spark to support Hive 0.13, but I always hit the following
problem when running the tests. My feeling is that the test setup may need to
change, but I don't know exactly how. Has anyone seen a similar issue, or can
anyone shed light on it?

13:50:53.331 ERROR org.apache.hadoop.hive.ql.Driver: FAILED:
SemanticException [Error 10072]: Database does not exist: default
org.apache.hadoop.hive.ql.parse.SemanticException: Database does not exist:
default
        at
org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getDatabase(BaseSemanticAnalyzer.java:1302)
        at
org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getDatabase(BaseSemanticAnalyzer.java:1291)
        at
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:9944)
        at
org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9180)
        at
org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:391)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:291)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:944)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1009)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:880)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:870)
        at
org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:292)
        at
org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:266)
        at
org.apache.spark.sql.hive.test.TestHiveContext.runSqlHive(TestHive.scala:83)
        at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
        at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
        at
org.apache.spark.sql.hive.HiveContext$QueryExecution.stringResult(HiveContext.scala:405)
        at
org.apache.spark.sql.hive.test.TestHiveContext$SqlCmd$$anonfun$cmd$1.apply$mcV$sp(TestHive.scala:164)
        at
org.apache.spark.sql.hive.test.TestHiveContext$$anonfun$loadTestTable$2.apply(TestHive.scala:282)
        at
org.apache.spark.sql.hive.test.TestHiveContext$$anonfun$loadTestTable$2.apply(TestHive.scala:282)
        at
scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at
scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
        at
org.apache.spark.sql.hive.test.TestHiveContext.loadTestTable(TestHive.scala:282)
        at
org.apache.spark.sql.hive.CachedTableSuite.<init>(CachedTableSuite.scala:28)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at java.lang.Class.newInstance(Class.java:374)
        at
org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:621)
        at sbt.ForkMain$Run$2.call(ForkMain.java:294)
        at sbt.ForkMain$Run$2.call(ForkMain.java:284)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hive.ql.parse.SemanticException: Database does
not exist: default
        at
org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.getDatabase(BaseSemanticAnalyzer.java:1298)
        ... 35 more



--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-testsuite-error-for-hive-0-13-tp7807.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: Spark testsuite error for hive 0.13.

Posted by Zhan Zhang <zh...@gmail.com>.
Problem solved by a workaround: explicitly creating a database and then issuing USE on it.
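For reference, a hypothetical sketch of that workaround (the database name `testdb` is illustrative; the original message does not name one): instead of relying on the implicit `default` database that the analyzer fails to find, create a database explicitly and switch to it before running the test queries:

```sql
-- Hypothetical workaround sketch: names are illustrative, not from the thread.
-- Create a fresh database so the analyzer no longer looks up a missing 'default'.
CREATE DATABASE IF NOT EXISTS testdb;
-- Make it the current database for subsequent statements.
USE testdb;
```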



--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-testsuite-error-for-hive-0-13-tp7807p7819.html


Re: Spark testsuite error for hive 0.13.

Posted by Zhan Zhang <zh...@gmail.com>.
Thanks Sean,

I changed both the API and the version because there are some incompatibilities
with Hive 0.13, and I can actually do some basic operations against a real Hive
environment. But the test suite always complains with the "no default database"
message. No clue yet.



--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-testsuite-error-for-hive-0-13-tp7807p7810.html


Re: Spark testsuite error for hive 0.13.

Posted by Sean Owen <so...@cloudera.com>.
I don't think this will work just by changing the version. Have a look
at: https://issues.apache.org/jira/browse/SPARK-2706

On Tue, Aug 12, 2014 at 12:17 AM, Zhan Zhang <zh...@gmail.com> wrote:
> I am trying to change spark to support hive-0.13, but always met following
> problem when running the test. My feeling is the test setup may need to
> change, but don't know exactly. Who has the similar issue or is able to shed
> light on it?
>
> [stack trace elided; identical to the trace quoted in full above]
