Posted to issues@spark.apache.org by "Ted Yu (JIRA)" <ji...@apache.org> on 2014/08/09 02:46:11 UTC

[jira] [Commented] (SPARK-2706) Enable Spark to support Hive 0.13

    [ https://issues.apache.org/jira/browse/SPARK-2706?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14091514#comment-14091514 ] 

Ted Yu commented on SPARK-2706:
-------------------------------

Running the Hive tests, I got:
{code}
*** RUN ABORTED ***
  java.lang.RuntimeException: Unable to load a Suite class that was discovered in the runpath: org.apache.spark.sql.hive.CachedTableSuite
  at org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:84)
  at org.scalatest.tools.DiscoverySuite$$anonfun$1.apply(DiscoverySuite.scala:38)
  at org.scalatest.tools.DiscoverySuite$$anonfun$1.apply(DiscoverySuite.scala:37)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
  at scala.collection.Iterator$class.foreach(Iterator.scala:727)
  at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
  at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
  at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
  at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
  ...
  Cause: org.apache.spark.sql.execution.QueryExecutionException: FAILED: SemanticException [Error 10072]: Database does not exist: default
  at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:296)
  at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:266)
  at org.apache.spark.sql.hive.test.TestHiveContext.runSqlHive(TestHive.scala:81)
  at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
  at org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
  at org.apache.spark.sql.hive.HiveContext$QueryExecution.stringResult(HiveContext.scala:405)
  at org.apache.spark.sql.hive.test.TestHiveContext$SqlCmd$$anonfun$cmd$1.apply$mcV$sp(TestHive.scala:161)
  at org.apache.spark.sql.hive.test.TestHiveContext$$anonfun$loadTestTable$2.apply(TestHive.scala:279)
  at org.apache.spark.sql.hive.test.TestHiveContext$$anonfun$loadTestTable$2.apply(TestHive.scala:279)
  at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
  ...
{code}

> Enable Spark to support Hive 0.13
> ---------------------------------
>
>                 Key: SPARK-2706
>                 URL: https://issues.apache.org/jira/browse/SPARK-2706
>             Project: Spark
>          Issue Type: Dependency upgrade
>          Components: SQL
>    Affects Versions: 1.0.1
>            Reporter: Chunjun Xiao
>         Attachments: hive.diff, spark-2706-v1.txt, spark-2706-v2.txt, spark-hive.err
>
>
> It seems Spark does not work well with Hive 0.13.
> When I compiled Spark against Hive 0.13.1, I got the error messages attached below.
> So, when will Spark be able to support Hive 0.13?
> Compilation errors:
> {quote}
> [ERROR] /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveContext.scala:180: type mismatch;
>  found   : String
>  required: Array[String]
> [ERROR]       val proc: CommandProcessor = CommandProcessorFactory.get(tokens(0), hiveconf)
> [ERROR]                                                                      ^
> [ERROR] /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala:264: overloaded method constructor TableDesc with alternatives:
>   (x$1: Class[_ <: org.apache.hadoop.mapred.InputFormat[_, _]],x$2: Class[_],x$3: java.util.Properties)org.apache.hadoop.hive.ql.plan.TableDesc <and>
>   ()org.apache.hadoop.hive.ql.plan.TableDesc
>  cannot be applied to (Class[org.apache.hadoop.hive.serde2.Deserializer], Class[(some other)?0(in value tableDesc)(in value tableDesc)], Class[?0(in value tableDesc)(in value tableDesc)], java.util.Properties)
> [ERROR]   val tableDesc = new TableDesc(
> [ERROR]                   ^
> [ERROR] /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala:140: value getPartitionPath is not a member of org.apache.hadoop.hive.ql.metadata.Partition
> [ERROR]       val partPath = partition.getPartitionPath
> [ERROR]                                ^
> [ERROR] /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/HiveTableScan.scala:132: value appendReadColumnNames is not a member of object org.apache.hadoop.hive.serde2.ColumnProjectionUtils
> [ERROR]     ColumnProjectionUtils.appendReadColumnNames(hiveConf, attributes.map(_.name))
> [ERROR]                           ^
> [ERROR] /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala:79: org.apache.hadoop.hive.common.type.HiveDecimal does not have a constructor
> [ERROR]       new HiveDecimal(bd.underlying())
> [ERROR]       ^
> [ERROR] /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala:132: type mismatch;
>  found   : org.apache.hadoop.fs.Path
>  required: String
> [ERROR]       SparkHiveHadoopWriter.createPathFromString(fileSinkConf.getDirName, conf))
> [ERROR]                                                               ^
> [ERROR] /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala:179: value getExternalTmpFileURI is not a member of org.apache.hadoop.hive.ql.Context
> [ERROR]     val tmpLocation = hiveContext.getExternalTmpFileURI(tableLocation)
> [ERROR]                                   ^
> [ERROR] /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveUdfs.scala:209: org.apache.hadoop.hive.common.type.HiveDecimal does not have a constructor
> [ERROR]           case bd: BigDecimal => new HiveDecimal(bd.underlying())
> [ERROR]                                  ^
> [ERROR] 8 errors found
> [DEBUG] Compilation failed (CompilerInterface)
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO] 
> [INFO] Spark Project Parent POM .......................... SUCCESS [2.579s]
> [INFO] Spark Project Core ................................ SUCCESS [2:39.805s]
> [INFO] Spark Project Bagel ............................... SUCCESS [21.148s]
> [INFO] Spark Project GraphX .............................. SUCCESS [59.950s]
> [INFO] Spark Project ML Library .......................... SUCCESS [1:08.771s]
> [INFO] Spark Project Streaming ........................... SUCCESS [1:17.759s]
> [INFO] Spark Project Tools ............................... SUCCESS [15.405s]
> [INFO] Spark Project Catalyst ............................ SUCCESS [1:17.405s]
> [INFO] Spark Project SQL ................................. SUCCESS [1:11.094s]
> [INFO] Spark Project Hive ................................ FAILURE [11.121s]
> [INFO] Spark Project REPL ................................ SKIPPED
> [INFO] Spark Project YARN Parent POM ..................... SKIPPED
> [INFO] Spark Project YARN Stable API ..................... SKIPPED
> [INFO] Spark Project Assembly ............................ SKIPPED
> [INFO] Spark Project External Twitter .................... SKIPPED
> [INFO] Spark Project External Kafka ...................... SKIPPED
> [INFO] Spark Project External Flume ...................... SKIPPED
> [INFO] Spark Project External ZeroMQ ..................... SKIPPED
> [INFO] Spark Project External MQTT ....................... SKIPPED
> [INFO] Spark Project Examples ............................ SKIPPED
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 9:25.609s
> [INFO] Finished at: Wed Jul 23 05:22:06 EDT 2014
> [INFO] Final Memory: 52M/873M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.1.6:compile (scala-compile-first) on project spark-hive_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.1.6:compile failed. CompileFailed -> [Help 1]
> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.1.6:compile (scala-compile-first) on project spark-hive_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.1.6:compile failed.
> 	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:225)
> 	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
> 	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
> 	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
> 	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
> 	at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
> 	at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
> 	at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
> 	at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
> 	at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
> 	at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
> 	at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
> 	at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
> 	at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
> 	at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
> Caused by: org.apache.maven.plugin.PluginExecutionException: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.1.6:compile failed.
> 	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:110)
> 	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
> 	... 19 more
> Caused by: Compilation failed
> 	at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:76)
> 	at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:35)
> 	at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:29)
> 	at sbt.compiler.AggressiveCompile$$anonfun$4$$anonfun$compileScala$1$1.apply$mcV$sp(AggressiveCompile.scala:71)
> 	at sbt.compiler.AggressiveCompile$$anonfun$4$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:71)
> 	at sbt.compiler.AggressiveCompile$$anonfun$4$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:71)
> 	at sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:101)
> 	at sbt.compiler.AggressiveCompile$$anonfun$4.compileScala$1(AggressiveCompile.scala:70)
> 	at sbt.compiler.AggressiveCompile$$anonfun$4.apply(AggressiveCompile.scala:88)
> 	at sbt.compiler.AggressiveCompile$$anonfun$4.apply(AggressiveCompile.scala:60)
> 	at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:24)
> 	at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:22)
> 	at sbt.inc.Incremental$.cycle(Incremental.scala:52)
> 	at sbt.inc.Incremental$.compile(Incremental.scala:29)
> 	at sbt.inc.IncrementalCompile$.apply(Compile.scala:20)
> 	at sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:96)
> 	at sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:44)
> 	at com.typesafe.zinc.Compiler.compile(Compiler.scala:158)
> 	at com.typesafe.zinc.Compiler.compile(Compiler.scala:142)
> 	at sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:77)
> 	at scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:308)
> 	at scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:124)
> 	at scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:104)
> 	at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:482)
> 	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
> 	... 20 more
> [ERROR] 
> {quote}
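
The compiler errors quoted above correspond to API changes between Hive 0.12 and Hive 0.13. A sketch of what the 0.13-side call sites might look like (method names taken from the Hive 0.13 javadocs, not from a verified patch; the surrounding variable names are those in the quoted errors, and everything here should be checked against the actual 0.13 jars):

{code}
// HiveContext.scala:180 - CommandProcessorFactory.get now takes String[]
val proc: CommandProcessor = CommandProcessorFactory.get(Array(tokens(0)), hiveconf)

// TableReader.scala:140 - getPartitionPath was removed; getDataLocation
// returns a Path in 0.13
val partPath = partition.getDataLocation

// InsertIntoHiveTable.scala:79 / hiveUdfs.scala:209 - HiveDecimal's public
// constructor was removed in favor of a factory method
HiveDecimal.create(bd.underlying())

// InsertIntoHiveTable.scala:132 - fileSinkConf.getDirName now returns a Path,
// so convert it before passing it where a String is required
SparkHiveHadoopWriter.createPathFromString(fileSinkConf.getDirName.toString, conf)

// InsertIntoHiveTable.scala:179 - getExternalTmpFileURI was replaced by
// getExternalTmpPath, which works on Paths rather than URIs
val tmpLocation = hiveContext.getExternalTmpPath(tableLocation)
{code}

The TableDesc constructor and ColumnProjectionUtils.appendReadColumnNames errors likewise reflect signatures that changed or disappeared in 0.13 and would need equivalent rewrites; since both versions cannot be satisfied by one call site, a version shim (or separate build profile per Hive version) seems like the likely shape of a real fix.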



--
This message was sent by Atlassian JIRA
(v6.2#6252)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org