Posted to issues@beam.apache.org by "Fernando Morales (Jira)" <ji...@apache.org> on 2021/04/26 22:02:00 UTC

[jira] [Updated] (BEAM-11485) Spark test failure: org.apache.beam.sdk.transforms.CombineFnsTest.testComposedCombineNullValues

     [ https://issues.apache.org/jira/browse/BEAM-11485?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Fernando Morales updated BEAM-11485:
------------------------------------
    Resolution: Fixed
        Status: Resolved  (was: Open)

The errors mentioned in this work item's parent were resolved once PR [https://github.com/apache/beam/pull/14483] was merged into master.
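
For context: the assertion failure quoted below comes from TestPipeline's PAssert bookkeeping. The test registers one PAssert, and TestPipeline.run() then verifies that the runner reported the same number of successful assertions. A rough sketch of that pattern (hypothetical test class and names, not the actual CombineFnsTest code), assuming the standard Beam JUnit setup:

{code:java}
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.Sum;
import org.apache.beam.sdk.values.PCollection;
import org.junit.Rule;
import org.junit.Test;

// Hypothetical illustration of the PAssert-counting pattern; not the Beam test that flaked.
public class PAssertCountingSketchTest {

  @Rule public final transient TestPipeline p = TestPipeline.create();

  @Test
  public void testSingleAssertionIsVerified() {
    PCollection<Integer> sum =
        p.apply(Create.of(1, 2, 3)).apply(Sum.integersGlobally());

    // One PAssert in the pipeline => TestPipeline expects exactly one successful assertion.
    PAssert.thatSingleton(sum).isEqualTo(6);

    // TestPipeline.run() submits the job and, for runners that report PAssert metrics,
    // checks the success counter afterwards. If the runner shuts down before the
    // assertion executes (as in the flake below), it fails with
    // "Expected 1 successful assertions, but found 0".
    p.run();
  }
}
{code}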

> Spark test failure: org.apache.beam.sdk.transforms.CombineFnsTest.testComposedCombineNullValues
> -----------------------------------------------------------------------------------------------
>
>                 Key: BEAM-11485
>                 URL: https://issues.apache.org/jira/browse/BEAM-11485
>             Project: Beam
>          Issue Type: Sub-task
>          Components: runner-spark, test-failures
>            Reporter: Tyson Hamilton
>            Priority: P1
>              Labels: flake, portability-spark
>
> From: [https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Streaming/466/testReport/org.apache.beam.sdk.transforms/CombineFnsTest/testComposedCombineNullValues/]
>  
> {code:java}
> Regression
> org.apache.beam.sdk.transforms.CombineFnsTest.testComposedCombineNullValues
> Failing for the past 1 build (Since #466). Took 41 sec.
>
> Error Message
> java.lang.AssertionError: Expected 1 successful assertions, but found 0. Expected: is <1L> but: was <0L>
>
> Stacktrace
> java.lang.AssertionError: Expected 1 successful assertions, but found 0. Expected: is <1L> but: was <0L>
>     at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:18)
>     at org.apache.beam.sdk.testing.TestPipeline.verifyPAssertsSucceeded(TestPipeline.java:516)
>     at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:354)
>     at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:334)
>     at org.apache.beam.sdk.transforms.CombineFnsTest.testComposedCombineNullValues(CombineFnsTest.java:254)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>     at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>     at org.junit.rules.ExpectedException$ExpectedExceptionStatement.evaluate(ExpectedException.java:266)
>     at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:322)
>     at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:305)
>     at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
>     at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:365)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
>     at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
>     at org.junit.runners.ParentRunner$4.run(ParentRunner.java:330)
>     at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:78)
>     at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:328)
>     at org.junit.runners.ParentRunner.access$100(ParentRunner.java:65)
>     at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:292)
>     at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:305)
>     at org.junit.runners.ParentRunner.run(ParentRunner.java:412)
>     at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
>     at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
>     at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
>     at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
>     at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
>     at sun.reflect.GeneratedMethodAccessor155.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
>     at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
>     at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
>     at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
>     at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
>     at org.gradle.api.internal.tasks.testing.worker.TestWorker.processTestClass(TestWorker.java:119)
>     at sun.reflect.GeneratedMethodAccessor154.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
>     at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
>     at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:182)
>     at org.gradle.internal.remote.internal.hub.MessageHubBackedObjectConnection$DispatchWrapper.dispatch(MessageHubBackedObjectConnection.java:164)
>     at org.gradle.internal.remote.internal.hub.MessageHub$Handler.run(MessageHub.java:414)
>     at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
>     at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
>     at java.lang.Thread.run(Thread.java:748)
> Standard Error
> 20/12/17 00:25:03 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingService started on localhost:39215 20/12/17 00:25:03 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Java ExpansionService started on localhost:34547 20/12/17 00:25:03 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobService started on localhost:36361 20/12/17 00:25:07 INFO org.apache.beam.runners.portability.PortableRunner: Using job server endpoint: localhost:36361 20/12/17 00:25:07 INFO org.apache.beam.runners.portability.PortableRunner: PrepareJobResponse: preparation_id: "combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_d818a0e3-681f-4c5a-9f67-915fa230821a" artifact_staging_endpoint { url: "localhost:39215" } staging_session_token: "combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_d818a0e3-681f-4c5a-9f67-915fa230821a" 20/12/17 00:25:07 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_d818a0e3-681f-4c5a-9f67-915fa230821a. 20/12/17 00:25:07 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_d818a0e3-681f-4c5a-9f67-915fa230821a.EMBEDDED. 20/12/17 00:25:07 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 313 artifacts for combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_d818a0e3-681f-4c5a-9f67-915fa230821a.null. 20/12/17 00:25:09 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_d818a0e3-681f-4c5a-9f67-915fa230821a. 
Dec 17, 2020 12:25:09 AM org.apache.beam.vendor.grpc.v1p26p0.io.grpc.netty.NettyServerHandler onStreamError WARNING: Stream Error org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.Http2Exception$StreamException: Received DATA frame for an unknown stream 3 at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.Http2Exception.streamError(Http2Exception.java:147) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.DefaultHttp2ConnectionDecoder$FrameReadListener.shouldIgnoreHeadersOrDataFrame(DefaultHttp2ConnectionDecoder.java:591) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.DefaultHttp2ConnectionDecoder$FrameReadListener.onDataRead(DefaultHttp2ConnectionDecoder.java:239) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.Http2InboundFrameLogger$1.onDataRead(Http2InboundFrameLogger.java:48) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.DefaultHttp2FrameReader.readDataFrame(DefaultHttp2FrameReader.java:422) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.DefaultHttp2FrameReader.processPayloadState(DefaultHttp2FrameReader.java:251) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.DefaultHttp2FrameReader.readFrame(DefaultHttp2FrameReader.java:160) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.Http2InboundFrameLogger.readFrame(Http2InboundFrameLogger.java:41) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.DefaultHttp2ConnectionDecoder.decodeFrame(DefaultHttp2ConnectionDecoder.java:174) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.Http2ConnectionHandler$FrameDecoder.decode(Http2ConnectionHandler.java:378) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.http2.Http2ConnectionHandler.decode(Http2ConnectionHandler.java:438) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:505) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:444) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:283) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1422) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:931) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:700) at 
org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:635) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:552) 20/12/17 00:25:09 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_01ebc07f-41a4-4913-b308-f1eeb377e453 at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:514) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at org.apache.beam.vendor.grpc.v1p26p0.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) at java.lang.Thread.run(Thread.java:748) 20/12/17 00:25:09 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_01ebc07f-41a4-4913-b308-f1eeb377e453 20/12/17 00:25:09 INFO org.apache.beam.runners.portability.PortableRunner: RunJobResponse: job_id: "combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_01ebc07f-41a4-4913-b308-f1eeb377e453" 20/12/17 00:25:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 313 files. (Enable logging at DEBUG level to see which files will be staged.) 20/12/17 00:25:09 INFO org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand new Spark Context. 20/12/17 00:25:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_01ebc07f-41a4-4913-b308-f1eeb377e453 on Spark master local[4] 20/12/17 00:25:09 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_01ebc07f-41a4-4913-b308-f1eeb377e453 on Spark master local[4] 20/12/17 00:25:10 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_01ebc07f-41a4-4913-b308-f1eeb377e453: Pipeline translated successfully. 
Computing outputs
20/12/17 00:25:10 WARN org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't support checkpointing
[... the same QueueInputDStream warning is logged several times per second for the duration of the streaming job ...]
20/12/17 00:25:39 WARN org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't
support checkpointing 20/12/17 00:25:39 WARN org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't support checkpointing 20/12/17 00:25:40 WARN org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't support checkpointing 20/12/17 00:25:40 WARN org.apache.spark.streaming.dstream.QueueInputDStream: queueStream doesn't support checkpointing 20/12/17 00:25:40 WARN org.apache.spark.streaming.util.BatchedWriteAheadLog: BatchedWriteAheadLog Writer queue interrupted. Exception in thread "streaming-job-executor-0" java.lang.Error: java.lang.InterruptedException at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1155) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: java.lang.InterruptedException at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998) at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304) at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:206) at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:222) at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:157) at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:243) at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:750) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126) at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:972) at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:970) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) at org.apache.spark.rdd.RDD.withScope(RDD.scala:385) at org.apache.spark.rdd.RDD.foreach(RDD.scala:970) at org.apache.spark.api.java.JavaRDDLike$class.foreach(JavaRDDLike.scala:351) at org.apache.spark.api.java.AbstractJavaRDDLike.foreach(JavaRDDLike.scala:45) at org.apache.beam.runners.spark.translation.streaming.UnboundedDataset.lambda$action$e3b46054$1(UnboundedDataset.java:79) at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272) at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272) at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628) at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50) at 
org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50) at scala.util.Try$.apply(Try.scala:192) at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ... 2 more 20/12/17 00:25:42 ERROR org.apache.spark.util.Utils: Aborting task java.io.IOException: Failed to connect to localhost/127.0.0.1:33183 at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:245) at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:187) at org.apache.spark.rpc.netty.NettyRpcEnv.org$apache$spark$rpc$netty$NettyRpcEnv$$downloadClient(NettyRpcEnv.scala:368) at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$openChannel$1.apply$mcV$sp(NettyRpcEnv.scala:336) at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$openChannel$1.apply(NettyRpcEnv.scala:335) at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$openChannel$1.apply(NettyRpcEnv.scala:335) at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394) at org.apache.spark.rpc.netty.NettyRpcEnv.openChannel(NettyRpcEnv.scala:339) at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:693) at org.apache.spark.util.Utils$.fetchFile(Utils.scala:509) at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:816) at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:808) at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733) at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130) at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130) at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236) at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40) at scala.collection.mutable.HashMap.foreach(HashMap.scala:130) at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732) at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:808) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:375) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/127.0.0.1:33183 Caused by: java.net.ConnectException: Connection refused at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716) at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:330) at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334) at 
io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:702) at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650) at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576) at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493) at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) at java.lang.Thread.run(Thread.java:748) 20/12/17 00:25:42 WARN org.apache.spark.util.Utils: Suppressing exception in catch: null java.lang.NullPointerException at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1402) at org.apache.spark.rpc.netty.NettyRpcEnv.openChannel(NettyRpcEnv.scala:339) at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:693) at org.apache.spark.util.Utils$.fetchFile(Utils.scala:509) at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:816) at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:808) at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733) at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130) at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130) at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236) at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40) at scala.collection.mutable.HashMap.foreach(HashMap.scala:130) at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732) at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:808) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:375) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) 20/12/17 00:25:42 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0): Failed to connect to localhost/127.0.0.1:33183 20/12/17 00:25:42 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_01ebc07f-41a4-4913-b308-f1eeb377e453 finished. 20/12/17 00:25:42 WARN org.apache.spark.streaming.StreamingContext: StreamingContext has already been stopped 20/12/17 00:25:43 ERROR org.apache.spark.executor.Executor: Exception in task 3.0 in stage 0.0 (TID 3): null 20/12/17 00:25:42 ERROR org.apache.spark.executor.Executor: Exception in task 2.0 in stage 0.0 (TID 2): Cannot retrieve files with 'spark' scheme without an active SparkEnv. 
20/12/17 00:25:43 ERROR org.apache.spark.executor.Executor: Exception in task 1.0 in stage 0.0 (TID 1): null 20/12/17 00:25:45 INFO org.apache.beam.runners.jobsubmission.InMemoryJobService: Getting job metrics for combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_01ebc07f-41a4-4913-b308-f1eeb377e453 20/12/17 00:25:45 INFO org.apache.beam.runners.jobsubmission.InMemoryJobService: Finished getting job metrics for combinefnstest0testcomposedcombinenullvalues-jenkins-1217002507-53b86688_01ebc07f-41a4-4913-b308-f1eeb377e453 20/12/17 00:25:45 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: JobServer stopped on localhost:36361 20/12/17 00:25:45 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: ArtifactStagingServer stopped on localhost:39215 20/12/17 00:25:45 INFO org.apache.beam.runners.jobsubmission.JobServerDriver: Expansion stopped on localhost:34547
> {code}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)