Posted to dev@hive.apache.org by "Zoltan Haindrich (JIRA)" <ji...@apache.org> on 2018/10/15 11:55:00 UTC

[jira] [Created] (HIVE-20747) Running a single Spark test alone (spark_explainuser_1) results in NoSuchMethodError

Zoltan Haindrich created HIVE-20747:
---------------------------------------

             Summary: Running a single Spark test alone (spark_explainuser_1) results in NoSuchMethodError
                 Key: HIVE-20747
                 URL: https://issues.apache.org/jira/browse/HIVE-20747
             Project: Hive
          Issue Type: Bug
          Components: Spark, Tests
            Reporter: Zoltan Haindrich


To reproduce:
{code}
time mvn install -Pitests -pl itests/qtest-spark/ -Dtest=TestMiniSparkOnYarnCliDriver#testCliDriver[spark_explainuser_1]  -am 
{code}
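
As a first diagnostic (my suggestion, not part of the original report): the NoSuchMethodError below points at a Kryo version conflict, so it may help to check which Kryo artifact the module actually resolves. A minimal sketch using the standard maven-dependency-plugin; the wildcard pattern is illustrative and matches both the old com.esotericsoftware.kryo:kryo and the newer com.esotericsoftware:kryo-shaded coordinates:
{code}
# Assumes the upstream SNAPSHOT artifacts are already installed
# (e.g. by the reproduce command above). Lists every Kryo artifact
# on the qtest-spark dependency tree.
mvn -Pitests -pl itests/qtest-spark/ dependency:tree -Dincludes=*:kryo*
{code}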

I think the error reported by the test run is misleading; the real exception, from hive.log, is:
{code}
2018-10-15T04:44:39,102 ERROR [5bad7b56-dbbe-4868-8006-0aeecf9eb6c3 main] status.SparkJobMonitor: Spark job[1] failed
java.util.concurrent.ExecutionException: Exception thrown by job
        at org.apache.spark.JavaFutureActionWrapper.getImpl(FutureAction.scala:337) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.JavaFutureActionWrapper.get(FutureAction.scala:342) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:404) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
        at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:365) ~[hive-exec-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_181]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_181]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_181]
        at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_181]
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure: Lost task 0.3 in stage 1.0 (TID 4, savara.lan, executor 1): java.lang.NoSuchMethodError: com.esotericsoftware.kryo.io.Output.writeVarInt(IZ)I
        at org.apache.hive.spark.HiveKryoRegistrator$HiveKeySerializer.write(HiveKryoRegistrator.java:44)
        at org.apache.hive.spark.HiveKryoRegistrator$HiveKeySerializer.write(HiveKryoRegistrator.java:41)
        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
        at org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:241)
        at org.apache.spark.serializer.SerializationStream.writeKey(Serializer.scala:132)
        at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:240)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
        at org.apache.spark.scheduler.Task.run(Task.scala:109)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1599) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1587) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1586) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) ~[scala-library-2.11.8.jar:?]
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48) ~[scala-library-2.11.8.jar:?]
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1586) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:831) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at scala.Option.foreach(Option.scala:257) ~[scala-library-2.11.8.jar:?]
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:831) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1820) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1769) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1758) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48) ~[spark-core_2.11-2.3.0.jar:2.3.0]
Caused by: java.lang.NoSuchMethodError: com.esotericsoftware.kryo.io.Output.writeVarInt(IZ)I
        at org.apache.hive.spark.HiveKryoRegistrator$HiveKeySerializer.write(HiveKryoRegistrator.java:44) ~[?:?]
        at org.apache.hive.spark.HiveKryoRegistrator$HiveKeySerializer.write(HiveKryoRegistrator.java:41) ~[?:?]
        at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568) ~[kryo-2.21.jar:?]
        at org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:241) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.serializer.SerializationStream.writeKey(Serializer.scala:132) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:240) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.scheduler.Task.run(Task.scala:109) ~[spark-core_2.11-2.3.0.jar:2.3.0]
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345) ~[spark-core_2.11-2.3.0.jar:2.3.0]
{code}
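
For context (my reading, not a confirmed root cause): a NoSuchMethodError at runtime for a call that compiled fine almost always means a different version of the library was loaded than the one compiled against. The executor frames above show kryo-2.21.jar on the classpath, and Output.writeVarInt(IZ)I does not exist in Kryo 2.21 (it was added in later Kryo releases), while HiveKryoRegistrator$HiveKeySerializer is built against the newer shaded Kryo. Below is a minimal, self-contained sketch of the failing pattern; BytesSerializer is a hypothetical stand-in for HiveKeySerializer, compiled against Kryo 3.x:
{code}
// Illustrative sketch only (not Hive's actual code): a serializer in the
// shape of the failing frame HiveKryoRegistrator$HiveKeySerializer.write.
// Compiled against Kryo 3.x, where Output.writeVarInt(int, boolean) exists;
// run it with kryo-2.21 on the classpath instead and the call site throws
// java.lang.NoSuchMethodError, since 2.21 only offers writeInt(int, boolean).
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.Serializer;
import com.esotericsoftware.kryo.io.Input;
import com.esotericsoftware.kryo.io.Output;

import java.io.ByteArrayOutputStream;

public class KryoMismatchSketch {

  // Hypothetical stand-in for HiveKeySerializer: length-prefixed byte[].
  static class BytesSerializer extends Serializer<byte[]> {
    @Override
    public void write(Kryo kryo, Output output, byte[] value) {
      // Resolves at runtime -> NoSuchMethodError on kryo-2.21.
      output.writeVarInt(value.length, true);
      output.writeBytes(value);
    }

    @Override
    public byte[] read(Kryo kryo, Input input, Class<byte[]> type) {
      int len = input.readVarInt(true);
      return input.readBytes(len);
    }
  }

  public static void main(String[] args) {
    Kryo kryo = new Kryo();
    Output out = new Output(new ByteArrayOutputStream());
    new BytesSerializer().write(kryo, out, new byte[]{1, 2, 3});
    out.flush();
    System.out.println("wrote " + out.total() + " bytes"); // fine on Kryo 3.x
  }
}
{code}
If that reading is right, the question becomes why the qtest-spark test classpath resolves kryo-2.21 instead of the shaded Kryo when only this module is built with -am, which the dependency:tree check above should show.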


