Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/07/13 06:07:55 UTC

Build failed in Jenkins: beam_PostCommit_Python_VR_Spark #3395

See <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/3395/display/redirect>

Changes:


------------------------------------------
[...truncated 2.64 MB...]
    delayed_application = self.dofn_runner.process_with_sized_restriction(o)
  File "apache_beam/runners/common.py", line 977, in process_with_sized_restriction
    watermark_estimator_state=estimator_state)
  File "apache_beam/runners/common.py", line 717, in invoke_process
    windowed_value, additional_args, additional_kwargs)
  File "apache_beam/runners/common.py", line 824, in _invoke_process_per_window
    self.threadsafe_restriction_tracker.check_done()
  File "apache_beam/runners/sdf_utils.py", line 115, in check_done
    return self._restriction_tracker.check_done()
  File "apache_beam/io/restriction_trackers.py", line 101, in check_done
    self._range.stop))
ValueError: OffsetRestrictionTracker is not done since work in range [0, 6) has not been claimed.
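(For context: this ValueError is raised when a splittable DoFn's process call returns without claiming every position in its restriction, so the tracker's check_done() fails. A minimal Python sketch of an SDF that does claim its whole offset range, with hypothetical names and not the code of the failing test:

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
    from apache_beam.transforms.core import RestrictionProvider

    class EachOffsetProvider(RestrictionProvider):      # hypothetical provider
      def initial_restriction(self, element):
        return OffsetRange(0, element)                  # e.g. element=6 gives range [0, 6)
      def create_tracker(self, restriction):
        return OffsetRestrictionTracker(restriction)
      def restriction_size(self, element, restriction):
        return restriction.size()

    class EmitEachOffset(beam.DoFn):
      def process(self, element,
                  tracker=beam.DoFn.RestrictionParam(EachOffsetProvider())):
        position = tracker.current_restriction().start
        # Every position up to the restriction's stop must be claimed before
        # returning; otherwise check_done() raises the ValueError seen above.
        while tracker.try_claim(position):
          yield position
          position += 1
)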

INFO:apache_beam.runners.portability.portable_runner:Job state changed to FAILED
.sssINFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:37553
WARNING:root:Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.24.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fa351f6df50> ====================
20/07/13 06:07:50 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_7d931a81-d779-4a4f-9eed-79160f5236e0.
20/07/13 06:07:50 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_7d931a81-d779-4a4f-9eed-79160f5236e0.ref_Environment_default_environment_1.
20/07/13 06:07:50 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 0 artifacts for job_7d931a81-d779-4a4f-9eed-79160f5236e0.null.
20/07/13 06:07:50 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_7d931a81-d779-4a4f-9eed-79160f5236e0.
20/07/13 06:07:50 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowed_pardo_state_timers_1594620470.3_57095b67-9fe5-42a9-8b81-a28a4abd7a41
20/07/13 06:07:50 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation test_windowed_pardo_state_timers_1594620470.3_57095b67-9fe5-42a9-8b81-a28a4abd7a41
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
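(As a side note, the pattern recommended in the message above can be spelled out as follows; a minimal sketch, where the job endpoint and transforms are illustrative assumptions, not taken from this build:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Hypothetical options; the job endpoint below is only for illustration.
    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',
        '--environment_type=LOOPBACK',
    ])

    # Exiting the 'with' block waits for the run to finish, so the LOOPBACK
    # worker started in this process stays available for the whole pipeline.
    with beam.Pipeline(options=options) as p:
      (p
       | beam.Create([1, 2, 3])
       | beam.Map(lambda x: x * 2))
)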
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
20/07/13 06:07:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
20/07/13 06:07:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 7 files. (Enable logging at DEBUG level to see which files will be staged.)
20/07/13 06:07:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowed_pardo_state_timers_1594620470.3_57095b67-9fe5-42a9-8b81-a28a4abd7a41 on Spark master local
20/07/13 06:07:51 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
20/07/13 06:07:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowed_pardo_state_timers_1594620470.3_57095b67-9fe5-42a9-8b81-a28a4abd7a41: Pipeline translated successfully. Computing outputs
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:43703.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
20/07/13 06:07:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 32-1
20/07/13 06:07:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 32-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:36999.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:37643
20/07/13 06:07:51 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
20/07/13 06:07:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 32-3
20/07/13 06:07:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 32-4
20/07/13 06:07:51 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 32-5
20/07/13 06:07:51 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowed_pardo_state_timers_1594620470.3_57095b67-9fe5-42a9-8b81-a28a4abd7a41 finished.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.20/07/13 06:07:51 INFO org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Closing environment urn: "beam:env:external:v1"
payload: "\n\021\n\017localhost:35673"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python2.7_sdk:2.24.0.dev"

20/07/13 06:07:51 WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer: Hanged up for unknown endpoint.
20/07/13 06:07:51 WARN org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory: Error cleaning up servers urn: "beam:env:external:v1"
payload: "\n\021\n\017localhost:35673"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:timer:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:param_windowed_value:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:apache/beam_python2.7_sdk:2.24.0.dev"

org.apache.beam.vendor.grpc.v1p26p0.io.grpc.StatusRuntimeException: UNAVAILABLE: io exception
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:240)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:221)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:140)
	at org.apache.beam.model.fnexecution.v1.BeamFnExternalWorkerPoolGrpc$BeamFnExternalWorkerPoolBlockingStub.stopWorker(BeamFnExternalWorkerPoolGrpc.java:247)
	at org.apache.beam.runners.fnexecution.environment.ExternalEnvironmentFactory$1.close(ExternalEnvironmentFactory.java:159)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DefaultJobBundleFactory.java:629)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.close(DefaultJobBundleFactory.java:629)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.unref(DefaultJobBundleFactory.java:645)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$WrappedSdkHarnessClient.access$400(DefaultJobBundleFactory.java:576)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.lambda$createEnvironmentCaches$3(DefaultJobBundleFactory.java:208)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1809)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3462)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3438)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3215)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.clear(LocalCache.java:4270)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4909)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.close(DefaultJobBundleFactory.java:315)
	at org.apache.beam.runners.fnexecution.control.DefaultExecutableStageContext.close(DefaultExecutableStageContext.java:43)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.closeActual(ReferenceCountingExecutableStageContextFactory.java:208)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory$WrappedContext.access$200(ReferenceCountingExecutableStageContextFactory.java:184)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.release(ReferenceCountingExecutableStageContextFactory.java:173)
	at org.apache.beam.runners.fnexecution.control.ReferenceCountingExecutableStageContextFactory.lambda$scheduleRelease$1(ReferenceCountingExecutableStageContextFactory.java:127)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/127.0.0.1:35673
Caused by: java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:714)
	at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:327)
	at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334)
	at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:688)
	at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:635)
	at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:552)
	at org.apache.beam.vendor.grpc.v1p26p0.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:514)
	at org.apache.beam.vendor.grpc.v1p26p0.io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044)
	at org.apache.beam.vendor.grpc.v1p26p0.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at org.apache.beam.vendor.grpc.v1p26p0.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.lang.Thread.run(Thread.java:748)
INFO:apache_beam.runners.worker.sdk_worker:No more requests from control plane
INFO:apache_beam.runners.worker.sdk_worker:SDK Harness waiting for in-flight requests to complete
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 413, in next
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 706, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1594620471.785150318","description":"Error received from peer ipv4:127.0.0.1:42933","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
INFO:apache_beam.runners.worker.data_plane:Closing all cached grpc data channels.
INFO:apache_beam.runners.worker.sdk_worker:Closing all cached gRPC state handlers.
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "apache_beam/runners/worker/data_plane.py", line 545, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 413, in next
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 706, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1594620471.785150318","description":"Error received from peer ipv4:127.0.0.1:42933","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>

INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:46739
WARNING:root:Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.24.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.worker.sdk_worker:Done consuming work.
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7fa351f6df50> ====================
20/07/13 06:07:52 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Staging artifacts for job_55f19f74-a7a7-4ad0-bc27-4bc4d545b05e.
20/07/13 06:07:52 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Resolving artifacts for job_55f19f74-a7a7-4ad0-bc27-4bc4d545b05e.ref_Environment_default_environment_1.
20/07/13 06:07:52 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Getting 0 artifacts for job_55f19f74-a7a7-4ad0-bc27-4bc4d545b05e.null.
20/07/13 06:07:52 INFO org.apache.beam.runners.fnexecution.artifact.ArtifactStagingService: Artifacts fully staged for job_55f19f74-a7a7-4ad0-bc27-4bc4d545b05e.
20/07/13 06:07:52 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking job test_windowing_1594620471.6_d644bbcc-4e8d-41c3-82db-f9213edeb396
20/07/13 06:07:52 INFO org.apache.beam.runners.jobsubmission.JobInvocation: Starting job invocation test_windowing_1594620471.6_d644bbcc-4e8d-41c3-82db-f9213edeb396
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
20/07/13 06:07:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath
20/07/13 06:07:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will stage 7 files. (Enable logging at DEBUG level to see which files will be staged.)
20/07/13 06:07:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Running job test_windowing_1594620471.6_d644bbcc-4e8d-41c3-82db-f9213edeb396 on Spark master local
20/07/13 06:07:52 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not consistent with equals. That might cause issues on some runners.
20/07/13 06:07:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1594620471.6_d644bbcc-4e8d-41c3-82db-f9213edeb396: Pipeline translated successfully. Computing outputs
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:46697.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
20/07/13 06:07:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam Fn Control client connected with id 33-1
20/07/13 06:07:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 33-2
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:43385.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:45885
20/07/13 06:07:52 INFO org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client connected.
20/07/13 06:07:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 33-3
20/07/13 06:07:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 33-4
20/07/13 06:07:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 33-5
20/07/13 06:07:52 INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: getProcessBundleDescriptor request with id 33-6
20/07/13 06:07:52 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job test_windowing_1594620471.6_d644bbcc-4e8d-41c3-82db-f9213edeb396 finished.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.
----------------------------------------------------------------------
Ran 42 tests in 71.886s

OK (skipped=10)

> Task :sdks:python:test-suites:portable:py2:sparkCompatibilityMatrixLoopback
> Task :sdks:python:test-suites:portable:py2:sparkValidatesRunner

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 148

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:createProcessWorker'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 29s
75 actionable tasks: 60 executed, 15 from cache

Publishing build scan...
https://gradle.com/s/2y33frnzhlyre

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python_VR_Spark #3396

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/3396/display/redirect>

