Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2015/06/18 22:12:01 UTC
[jira] [Comment Edited] (SPARK-8410) Hive VersionsSuite RuntimeException
[ https://issues.apache.org/jira/browse/SPARK-8410?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14592419#comment-14592419 ]
Michael Armbrust edited comment on SPARK-8410 at 6/18/15 8:11 PM:
------------------------------------------------------------------
Does the maven integration work for you in other places? Like when launching the spark shell?
Details under Advanced Dependency Management:
https://spark.apache.org/docs/latest/submitting-applications.html
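The "maven integration" referred to above is the --packages flag, which resolves Maven coordinates via Ivy when the shell launches. A minimal sketch for checking it independently of the test suite (the coordinate is one of the artifacts that fails to download below, used here purely as an example):

```shell
# Ask spark-shell to resolve a Maven coordinate at startup. If this fails
# with the same "download failed" error, the problem is the local Ivy/Maven
# environment (proxy, cache, repository access), not the Hive test suite.
./bin/spark-shell --packages org.jboss.netty:netty:3.2.2.Final
```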
> Hive VersionsSuite RuntimeException
> -----------------------------------
>
> Key: SPARK-8410
> URL: https://issues.apache.org/jira/browse/SPARK-8410
> Project: Spark
> Issue Type: Question
> Components: SQL
> Affects Versions: 1.3.1, 1.4.0
> Environment: IBM Power system - P7
> running Ubuntu 14.04LE
> with IBM JDK version 1.7.0
> Reporter: Josiah Samuel Sathiadass
> Priority: Minor
>
> While testing Spark Project Hive, the following RuntimeException occurs:
> VersionsSuite:
> - success sanity check *** FAILED ***
> java.lang.RuntimeException: [download failed: org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle), download failed: org.codehaus.groovy#groovy-all;2.1.6!groovy-all.jar, download failed: asm#asm;3.2!asm.jar]
> at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:978)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$3.apply(IsolatedClientLoader.scala:62)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$3.apply(IsolatedClientLoader.scala:62)
> at org.apache.spark.sql.catalyst.util.package$.quietly(package.scala:38)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$.org$apache$spark$sql$hive$client$IsolatedClientLoader$$downloadVersion(IsolatedClientLoader.scala:61)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$1.apply(IsolatedClientLoader.scala:44)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$1.apply(IsolatedClientLoader.scala:44)
> at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:189)
> at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:91)
> at org.apache.spark.sql.hive.client.IsolatedClientLoader$.forVersion(IsolatedClientLoader.scala:44)
> ...
> The tests are executed with the following set of options:
> build/mvn -pl sql/hive --fail-never -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 test
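Before patching the pom, it may be worth checking whether the three failing artifacts are resolvable from this environment at all. A hedged sketch using Maven's standard dependency:get goal (coordinates copied from the error above; assumes network access to Maven Central):

```shell
# Try to fetch each failing artifact directly into the local repository.
# Any failure here points at repository/proxy configuration rather than Spark.
mvn -q dependency:get -Dartifact=org.jboss.netty:netty:3.2.2.Final
mvn -q dependency:get -Dartifact=org.codehaus.groovy:groovy-all:2.1.6
mvn -q dependency:get -Dartifact=asm:asm:3.2
```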
> Adding the following test-scoped dependencies to "spark/sql/hive/pom.xml" resolves the issue:
> <dependency>
>   <groupId>org.jboss.netty</groupId>
>   <artifactId>netty</artifactId>
>   <version>3.2.2.Final</version>
>   <scope>test</scope>
> </dependency>
> <dependency>
>   <groupId>org.codehaus.groovy</groupId>
>   <artifactId>groovy-all</artifactId>
>   <version>2.1.6</version>
>   <scope>test</scope>
> </dependency>
> <dependency>
>   <groupId>asm</groupId>
>   <artifactId>asm</artifactId>
>   <version>3.2</version>
>   <scope>test</scope>
> </dependency>
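After applying the pom change, re-running only the failing suite is faster than the whole module. A sketch assuming Spark's scalatest-maven-plugin setup (the wildcardSuites property is an assumption here, as is the exact suite class name, which is inferred from the stack trace's package):

```shell
# Re-run only VersionsSuite in the sql/hive module with the same profiles
# as the original failing invocation.
build/mvn -pl sql/hive -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 \
  -DwildcardSuites=org.apache.spark.sql.hive.client.VersionsSuite test
```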
> The question is: is this the correct way to fix this RuntimeException?
> If yes, can a pull request fix this issue permanently?
> If not, suggestions are welcome.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org