Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2022/06/04 16:26:31 UTC
[GitHub] [beam] damccorm opened a new issue, #20293: Spark Runner Tests failing [Java 11]
damccorm opened a new issue, #20293:
URL: https://github.com/apache/beam/issues/20293
Gradle task `:runners:spark:test` fails during the Java 11 PreCommit job.
Example stack trace:
```
> Task :runners:spark:test
20/05/26 07:26:31 INFO org.apache.beam.runners.spark.metrics.MetricsAccumulator:
Instantiated metrics accumulator: {
"metrics": {
}
}
org.apache.beam.runners.spark.structuredstreaming.StructuredStreamingPipelineStateTest
> testBatchPipelineRunningState STANDARD_ERROR
20/05/26 07:26:32 INFO org.apache.beam.runners.spark.structuredstreaming.SparkStructuredStreamingRunner:
*** SparkStructuredStreamingRunner is based on spark structured streaming framework and is no more
based on RDD/DStream API. See
https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html
It is still experimental, its coverage of the Beam model is partial. ***
org.apache.beam.runners.spark.SparkPortableExecutionTest
> testExecution STANDARD_ERROR
20/05/26 07:26:33 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions:
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals.
That might cause issues on some runners.
org.apache.beam.runners.spark.structuredstreaming.translation.batch.FlattenTest
> testFlatten STANDARD_ERROR
20/05/26 07:26:34 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
org.apache.beam.runners.spark.SparkPortableExecutionTest
> testExecution STANDARD_ERROR
20/05/26 07:26:34 ERROR org.apache.beam.runners.jobsubmission.JobInvocation:
Error during job invocation fakeId.
java.lang.IllegalArgumentException: Unsupported class file major version 55
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:990)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
at org.apache.spark.rdd.RDD.collect(RDD.scala:989)
at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:361)
at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
at org.apache.beam.runners.spark.translation.BoundedDataset.getBytes(BoundedDataset.java:76)
at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.broadcastSideInput(SparkBatchPortablePipelineTranslator.java:354)
at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.broadcastSideInputs(SparkBatchPortablePipelineTranslator.java:338)
at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translateExecutableStage(SparkBatchPortablePipelineTranslator.java:216)
at org.apache.beam.runners.spark.translation.SparkBatchPortablePipelineTranslator.translate(SparkBatchPortablePipelineTranslator.java:138)
at org.apache.beam.runners.spark.SparkPipelineRunner.lambda$run$1(SparkPipelineRunner.java:122)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
org.apache.beam.runners.spark.SparkPortableExecutionTest
> testExecution FAILED
java.lang.AssertionError: expected:<DONE> but was:<FAILED>
at org.junit.Assert.fail(Assert.java:89)
at org.junit.Assert.failNotEquals(Assert.java:835)
at org.junit.Assert.assertEquals(Assert.java:120)
at org.junit.Assert.assertEquals(Assert.java:146)
at org.apache.beam.runners.spark.SparkPortableExecutionTest.testExecution(SparkPortableExecutionTest.java:159)
org.apache.beam.runners.spark.SparkPortableExecutionTest
> testExecStageWithMultipleOutputs STANDARD_ERROR
20/05/26 07:26:35 INFO org.apache.beam.runners.jobsubmission.JobInvocation:
Starting job invocation testExecStageWithMultipleOutputs
20/05/26 07:26:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner:
PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
20/05/26 07:26:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 289 files. (Enable logging at DEBUG level to see which files will be staged.)
20/05/26 07:26:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner:
Running job testExecStageWithMultipleOutputs on Spark master local[4]
20/05/26 07:26:37 INFO org.apache.beam.runners.spark.SparkPipelineRunner:
Job testExecStageWithMultipleOutputs: Pipeline translated successfully. Computing outputs
Gradle Test Executor 114 started executing tests.
```
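For context, "Unsupported class file major version 55" means Java 11 bytecode (class-file version 55) was handed to a bytecode parser that predates Java 11: the `org.apache.xbean.asm6` `ClassReader` invoked by Spark's `ClosureCleaner` rejects any major version it does not recognize. A class file's major version sits in bytes 6-7 of its header, and the Java release number is `majorVersion - 44` (52 is Java 8, 55 is Java 11). The following sketch is purely illustrative (the class and method names are not part of Beam or Spark) and shows how that header field can be read:

```java
import java.io.IOException;
import java.io.InputStream;

public class ClassFileVersion {
    // Maps a class-file major version to the Java release that produces it:
    // majorVersion - 44 (e.g. 52 -> Java 8, 55 -> Java 11).
    static int javaRelease(int majorVersion) {
        return majorVersion - 44;
    }

    // Reads the major version from the standard class-file header:
    // 4 bytes magic (0xCAFEBABE), 2 bytes minor version, 2 bytes major version.
    static int readMajorVersion(InputStream in) throws IOException {
        byte[] header = in.readNBytes(8);
        if (header.length < 8
                || header[0] != (byte) 0xCA || header[1] != (byte) 0xFE
                || header[2] != (byte) 0xBA || header[3] != (byte) 0xBE) {
            throw new IOException("Not a class file");
        }
        return ((header[6] & 0xFF) << 8) | (header[7] & 0xFF);
    }

    public static void main(String[] args) throws IOException {
        // Inspect this class's own bytecode as loaded from the classpath.
        try (InputStream in = ClassFileVersion.class
                .getResourceAsStream("ClassFileVersion.class")) {
            int major = readMajorVersion(in);
            System.out.println("major=" + major + ", java=" + javaRelease(major));
        }
    }
}
```

Running something like this against the classes on the test classpath would confirm which JDK compiled them; here the failure indicates Java 11 output reaching an ASM 6 based reader that only understands versions up to Java 10 (major version 54).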
Imported from Jira [BEAM-10083](https://issues.apache.org/jira/browse/BEAM-10083). Original Jira may contain additional context.
Reported by: pawel.pasterz.
Subtask of issue #20290
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: github-unsubscribe@beam.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
Re: [I] Spark Runner Tests failing [Java 11] [beam]
Posted by "Abacn (via GitHub)" <gi...@apache.org>.
Abacn closed issue #20293: Spark Runner Tests failing [Java 11]
URL: https://github.com/apache/beam/issues/20293