Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/07/28 18:19:50 UTC

Build failed in Jenkins: beam_PreCommit_Python2_PVR_Flink_Cron #1190

See <https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/1190/display/redirect?page=changes>

Changes:

[Maximilian Michels] [BEAM-9976] Increase timeout for FlinkSavepointTest

[Maximilian Michels] [BEAM-9976] Add a check to FlinkSavepointTest to ensure no job is

[Maximilian Michels] [BEAM-9976] Remove legacy workaround to retrieve rest address

[Kenneth Knowles] Add log4j configuration to sorter extension

[Kenneth Knowles] Eliminate null errors from :sdks:java:extensions:sorter and enable

[noreply] [BEAM-10580] Eliminate nullability errors from

[noreply] [Beam-10563] Eliminate nullability errors from

[noreply] [BEAM-10562] Eliminate nullability errors from

[noreply] [BEAM-9865] Cleanup Jenkins WS on successful jobs (#12326)


------------------------------------------
[...truncated 2.62 MB...]
	... 4 more
Caused by: java.lang.RuntimeException: Failed to finish remote bundle
	at org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator$SdkHarnessDoFnRunner.finishBundle(ExecutableStageDoFnOperator.java:769)
	at org.apache.beam.runners.flink.metrics.DoFnRunnerWithMetricsUpdate.finishBundle(DoFnRunnerWithMetricsUpdate.java:89)
	at org.apache.beam.runners.core.SimplePushbackSideInputDoFnRunner.finishBundle(SimplePushbackSideInputDoFnRunner.java:124)
	at org.apache.beam.runners.flink.translation.wrappers.streaming.DoFnOperator.invokeFinishBundle(DoFnOperator.java:840)
	at org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator.close(ExecutableStageDoFnOperator.java:489)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.closeAllOperators(StreamTask.java:618)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.lambda$afterInvoke$1(StreamTask.java:498)
	at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$SynchronizedStreamTaskActionExecutor.runThrowing(StreamTaskActionExecutor.java:94)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.afterInvoke(StreamTask.java:496)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:477)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:708)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:533)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 5: Traceback (most recent call last):
  File "apache_beam/runners/worker/sdk_worker.py", line 256, in _execute
    response = task()
  File "apache_beam/runners/worker/sdk_worker.py", line 313, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "apache_beam/runners/worker/sdk_worker.py", line 483, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "apache_beam/runners/worker/sdk_worker.py", line 518, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "apache_beam/runners/worker/bundle_processor.py", line 978, in process_bundle
    element.data)
  File "apache_beam/runners/worker/bundle_processor.py", line 218, in process_encoded
    self.output(decoded_value)
  File "apache_beam/runners/worker/operations.py", line 332, in output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 195, in receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 779, in process
    o)
  File "apache_beam/runners/common.py", line 1224, in process_with_sized_restriction
    watermark_estimator_state=estimator_state)
  File "apache_beam/runners/common.py", line 723, in invoke_process
    windowed_value, additional_args, additional_kwargs)
  File "apache_beam/runners/common.py", line 872, in _invoke_process_per_window
    self.threadsafe_restriction_tracker.check_done()
  File "apache_beam/runners/sdf_utils.py", line 115, in check_done
    return self._restriction_tracker.check_done()
  File "apache_beam/io/restriction_trackers.py", line 106, in check_done
    self._range.stop))
ValueError: OffsetRestrictionTracker is not done since work in range [0, 3) has not been claimed.

	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
	at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:493)
	at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:547)
	at org.apache.beam.runners.flink.translation.wrappers.streaming.ExecutableStageDoFnOperator$SdkHarnessDoFnRunner.finishBundle(ExecutableStageDoFnOperator.java:763)
	... 12 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 5: Traceback (most recent call last):
  File "apache_beam/runners/worker/sdk_worker.py", line 256, in _execute
    response = task()
  File "apache_beam/runners/worker/sdk_worker.py", line 313, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "apache_beam/runners/worker/sdk_worker.py", line 483, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "apache_beam/runners/worker/sdk_worker.py", line 518, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "apache_beam/runners/worker/bundle_processor.py", line 978, in process_bundle
    element.data)
  File "apache_beam/runners/worker/bundle_processor.py", line 218, in process_encoded
    self.output(decoded_value)
  File "apache_beam/runners/worker/operations.py", line 332, in output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 195, in receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 779, in process
    o)
  File "apache_beam/runners/common.py", line 1224, in process_with_sized_restriction
    watermark_estimator_state=estimator_state)
  File "apache_beam/runners/common.py", line 723, in invoke_process
    windowed_value, additional_args, additional_kwargs)
  File "apache_beam/runners/common.py", line 872, in _invoke_process_per_window
    self.threadsafe_restriction_tracker.check_done()
  File "apache_beam/runners/sdf_utils.py", line 115, in check_done
    return self._restriction_tracker.check_done()
  File "apache_beam/io/restriction_trackers.py", line 106, in check_done
    self._range.stop))
ValueError: OffsetRestrictionTracker is not done since work in range [0, 3) has not been claimed.

	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:177)
	at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
	at org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more
ERROR:root:java.lang.RuntimeException: Error received from SDK harness for instruction 5: Traceback (most recent call last):
  File "apache_beam/runners/worker/sdk_worker.py", line 256, in _execute
    response = task()
  File "apache_beam/runners/worker/sdk_worker.py", line 313, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "apache_beam/runners/worker/sdk_worker.py", line 483, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "apache_beam/runners/worker/sdk_worker.py", line 518, in process_bundle
    bundle_processor.process_bundle(instruction_id))
  File "apache_beam/runners/worker/bundle_processor.py", line 978, in process_bundle
    element.data)
  File "apache_beam/runners/worker/bundle_processor.py", line 218, in process_encoded
    self.output(decoded_value)
  File "apache_beam/runners/worker/operations.py", line 332, in output
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 195, in receive
    self.consumer.process(windowed_value)
  File "apache_beam/runners/worker/operations.py", line 779, in process
    o)
  File "apache_beam/runners/common.py", line 1224, in process_with_sized_restriction
    watermark_estimator_state=estimator_state)
  File "apache_beam/runners/common.py", line 723, in invoke_process
    windowed_value, additional_args, additional_kwargs)
  File "apache_beam/runners/common.py", line 872, in _invoke_process_per_window
    self.threadsafe_restriction_tracker.check_done()
  File "apache_beam/runners/sdf_utils.py", line 115, in check_done
    return self._restriction_tracker.check_done()
  File "apache_beam/io/restriction_trackers.py", line 106, in check_done
    self._range.stop))
ValueError: OffsetRestrictionTracker is not done since work in range [0, 3) has not been claimed.

INFO:apache_beam.runners.portability.portable_runner:Job state changed to FAILED
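
The ValueError above is raised by OffsetRestrictionTracker.check_done(): the bundle finished while offsets in [0, 3) were still unclaimed, so the runner treats the element as incomplete and fails the job. A splittable DoFn only satisfies that check if its process() claims every position of its restriction before returning. A minimal sketch of that pattern (the class names and the fixed [0, 3) range are illustrative, not the code under test):

    import apache_beam as beam
    from apache_beam.io.restriction_trackers import OffsetRange, OffsetRestrictionTracker
    from apache_beam.transforms.core import RestrictionProvider

    class FixedRangeProvider(RestrictionProvider):
        # Every element is given the offsets [0, 3) to work through.
        def initial_restriction(self, element):
            return OffsetRange(0, 3)

        def create_tracker(self, restriction):
            return OffsetRestrictionTracker(restriction)

        def restriction_size(self, element, restriction):
            return restriction.size()

    class EmitOffsets(beam.DoFn):
        def process(self, element,
                    tracker=beam.DoFn.RestrictionParam(FixedRangeProvider())):
            restriction = tracker.current_restriction()
            for offset in range(restriction.start, restriction.stop):
                # Claiming each offset is what lets check_done() pass; returning
                # before the range is exhausted (without deferring the remainder)
                # reproduces the "work in range [0, 3) has not been claimed" error.
                if not tracker.try_claim(offset):
                    return
                yield (element, offset)
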
.sssssINFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:35617
WARNING:root:Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.24.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
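In LOOPBACK mode the SDK worker runs inside the submitting process, which is why the runner insists on the context-manager form quoted above: leaving the "with" block waits for the job, keeping the in-process worker alive until the pipeline finishes. A minimal sketch of that usage against a portable job endpoint (the endpoint address and the trivial pipeline are assumptions for illustration, not taken from this test suite):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',   # assumed Flink job-server address
        '--environment_type=LOOPBACK',
    ])
    with beam.Pipeline(options=options) as p:
        (p
         | beam.Create([1, 2, 3])
         | beam.Map(lambda x: x * x))
    # Exiting the "with" block runs the pipeline and waits for it, so the
    # LOOPBACK worker started by this process is alive for the whole job.
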
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
[flink-runner-job-invoker] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - Log file environment variable 'log.file' is not set.
[flink-runner-job-invoker] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - JobManager log files are unavailable in the web dashboard. Log file location not found in environment variable 'log.file' or configuration key 'Key: 'web.log.path' , default: null (fallback keys: [{key=jobmanager.web.log.path, isDeprecated=true}])'.
[[5]{Create, Map(<lambda at fn_runner_test.py:490>), WindowInto(WindowIntoFn), Map(<lambda at fn_runner_test.py:493>)} (1/2)] WARN org.apache.flink.metrics.MetricGroup - The operator name [5]{Create, Map(<lambda at fn_runner_test.py:490>), WindowInto(WindowIntoFn), Map(<lambda at fn_runner_test.py:493>)} exceeded the 80 characters length limit and was truncated.
[[5]{Create, Map(<lambda at fn_runner_test.py:490>), WindowInto(WindowIntoFn), Map(<lambda at fn_runner_test.py:493>)} (2/2)] WARN org.apache.flink.metrics.MetricGroup - The operator name [5]{Create, Map(<lambda at fn_runner_test.py:490>), WindowInto(WindowIntoFn), Map(<lambda at fn_runner_test.py:493>)} exceeded the 80 characters length limit and was truncated.
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:44387.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:34221.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:41115
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 413, in next
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 706, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1595960385.726259009","description":"Error received from peer ipv4:127.0.0.1:41115","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "apache_beam/runners/worker/data_plane.py", line 545, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "apache_beam/runners/worker/data_plane.py", line 528, in _read_inputs
    for elements in elements_iterator:
  File "<https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 413, in next
    return self._next()
  File "<https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py",> line 706, in _next
    raise self
_MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that terminated with:
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1595960385.726259009","description":"Error received from peer ipv4:127.0.0.1:41115","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
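
The CANCELLED status in this traceback is shutdown noise rather than the test failure: the runner hangs up the data channel ("Multiplexer hanging up") once the bundle is finished, and the job still reaches DONE a few lines below. A reader thread that logs real transport errors but swallows this teardown case could be sketched as follows (illustrative only, not the data_plane.py implementation):

    import grpc

    def read_inputs(elements_iterator, consumer):
        try:
            for elements in elements_iterator:
                consumer(elements)
        except grpc.RpcError as e:
            # CANCELLED just means the peer closed the multiplexed channel
            # during shutdown; anything else is a genuine data-plane error.
            if e.code() != grpc.StatusCode.CANCELLED:
                raise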

INFO:apache_beam.runners.worker.sdk_worker:No more requests from control plane
INFO:apache_beam.runners.worker.sdk_worker:SDK Harness waiting for in-flight requests to complete
INFO:apache_beam.runners.worker.data_plane:Closing all cached grpc data channels.
INFO:apache_beam.runners.worker.sdk_worker:Closing all cached gRPC state handlers.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:apache_beam.runners.worker.sdk_worker:Done consuming work.
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:34499
WARNING:root:Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python2.7_sdk:2.24.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.portability.portable_runner:Environment "LOOPBACK" has started a component necessary for the execution. Be sure to run the pipeline using
  with Pipeline() as p:
    p.apply(..)
This ensures that the pipeline finishes before this program exits.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to RUNNING
[flink-runner-job-invoker] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - Log file environment variable 'log.file' is not set.
[flink-runner-job-invoker] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - JobManager log files are unavailable in the web dashboard. Log file location not found in environment variable 'log.file' or configuration key 'Key: 'web.log.path' , default: null (fallback keys: [{key=jobmanager.web.log.path, isDeprecated=true}])'.
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel for localhost:42839.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with unbounded number of workers.
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for localhost:42421.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for localhost:40423
[assert_that/Group/GroupByKey -> [3]assert_that/{Group, Unkey, Match} (1/2)] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
INFO:apache_beam.runners.worker.sdk_worker:No more requests from control plane
INFO:apache_beam.runners.worker.sdk_worker:SDK Harness waiting for in-flight requests to complete
INFO:apache_beam.runners.worker.data_plane:Closing all cached grpc data channels.
INFO:apache_beam.runners.worker.sdk_worker:Closing all cached gRPC state handlers.
INFO:apache_beam.runners.worker.sdk_worker:Done consuming work.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
.INFO:__main__:removing conf dir: /tmp/flinktest-conf1y9U6c

----------------------------------------------------------------------
Ran 100 tests in 166.315s

OK (skipped=26)

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 57

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:flinkCompatibilityMatrixBatchLOOPBACK'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 18m 34s
108 actionable tasks: 79 executed, 28 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/xsgahijwgoimu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PreCommit_Python2_PVR_Flink_Cron #1192

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/1192/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PreCommit_Python2_PVR_Flink_Cron #1191

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/1191/display/redirect?page=changes>

Changes:

[Valentyn Tymofieiev] Build Py3.8 docker containers.

[Valentyn Tymofieiev] Correct docs tox task.

[Valentyn Tymofieiev] Correct the commit hash used in docs generation.

[Valentyn Tymofieiev] Exit gracefully when docs PR was created on previous RC.

[Valentyn Tymofieiev] Add Py38 Postcommits to the list of suites to trigger.

[Valentyn Tymofieiev] Removes unclear instruction.

[Kenneth Knowles] Disable checker plugin for Java 11 jobs due to

[simonepri] Improve beam CreateList test coverage


------------------------------------------
[...truncated 43.90 KB...]
> Task :buildSrc:jar
> Task :buildSrc:assemble

> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE

> Task :sdks:go:resolveBuildDependencies
Resolving golang.org/x/debug: commit='95515998a8a4bd7448134b2cb5971dbeb12e0b77', urls=[https://go.googlesource.com/debug]

> Task :sdks:java:extensions:join-library:compileJava FROM-CACHE
> Task :sdks:java:extensions:join-library:classes UP-TO-DATE

> Task :sdks:python:test-suites:portable:py2:setupVirtualenv
Collecting protobuf>=3.5.0.post1
  Using cached protobuf-3.12.2-cp27-cp27mu-manylinux1_x86_64.whl (1.3 MB)
Collecting importlib-resources>=1.0; python_version < "3.7"
  Using cached importlib_resources-3.0.0-py2.py3-none-any.whl (23 kB)
Collecting pathlib2<3,>=2.3.3; python_version < "3.4" and sys_platform != "win32"
  Using cached pathlib2-2.3.5-py2.py3-none-any.whl (18 kB)
Collecting distlib<1,>=0.3.1
  Using cached distlib-0.3.1-py2.py3-none-any.whl (335 kB)
Collecting appdirs<2,>=1.4.3
  Using cached appdirs-1.4.4-py2.py3-none-any.whl (9.6 kB)
Collecting importlib-metadata<2,>=0.12; python_version < "3.8"
  Using cached importlib_metadata-1.7.0-py2.py3-none-any.whl (31 kB)
Collecting enum34>=1.0.4; python_version < "3.4"
  Using cached enum34-1.1.10-py2-none-any.whl (11 kB)
Collecting futures>=2.2.0; python_version < "3.2"
  Using cached futures-3.3.0-py2-none-any.whl (16 kB)
Collecting zipp>=0.4; python_version < "3.8"
  Using cached zipp-1.2.0-py2.py3-none-any.whl (4.8 kB)
Collecting singledispatch; python_version < "3.4"
  Using cached singledispatch-3.4.0.3-py2.py3-none-any.whl (12 kB)
Collecting contextlib2; python_version < "3"
  Using cached contextlib2-0.6.0.post1-py2.py3-none-any.whl (9.8 kB)
Collecting typing; python_version < "3.5"
  Using cached typing-3.7.4.3-py2-none-any.whl (26 kB)
Processing /home/jenkins/.cache/pip/wheels/58/2c/26/52406f7d1f19bcc47a6fbd1037a5f293492f5cf1d58c539edb/scandir-1.10.0-cp27-cp27mu-linux_x86_64.whl
Collecting configparser>=3.5; python_version < "3"
  Using cached configparser-4.0.2-py2.py3-none-any.whl (22 kB)
Installing collected packages: toml, filelock, contextlib2, zipp, six, singledispatch, scandir, pathlib2, typing, importlib-resources, distlib, appdirs, configparser, importlib-metadata, virtualenv, pluggy, py, tox, enum34, futures, grpcio, protobuf, grpcio-tools, future, mypy-protobuf
Successfully installed appdirs-1.4.4 configparser-4.0.2 contextlib2-0.6.0.post1 distlib-0.3.1 enum34-1.1.10 filelock-3.0.12 future-0.18.2 futures-3.3.0 grpcio-1.30.0 grpcio-tools-1.30.0 importlib-metadata-1.7.0 importlib-resources-3.0.0 mypy-protobuf-1.18 pathlib2-2.3.5 pluggy-0.13.1 protobuf-3.12.2 py-1.9.0 scandir-1.10.0 singledispatch-3.4.0.3 six-1.15.0 toml-0.10.1 tox-3.11.1 typing-3.7.4.3 virtualenv-20.0.28 zipp-1.2.0

> Task :runners:local-java:compileJava FROM-CACHE
> Task :runners:local-java:classes UP-TO-DATE
> Task :sdks:java:extensions:join-library:jar
> Task :runners:local-java:jar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:core:jar
> Task :sdks:java:harness:jar
> Task :sdks:java:io:hadoop-common:compileJava FROM-CACHE
> Task :sdks:java:io:hadoop-common:classes UP-TO-DATE
> Task :sdks:java:io:mongodb:compileJava FROM-CACHE
> Task :sdks:java:io:hadoop-common:jar
> Task :sdks:java:io:mongodb:classes UP-TO-DATE
> Task :sdks:java:io:mongodb:jar
> Task :sdks:java:io:parquet:compileJava FROM-CACHE
> Task :sdks:java:io:parquet:classes UP-TO-DATE
> Task :sdks:java:io:parquet:jar

> Task :sdks:java:container:pullLicenses
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.

> Configure project :sdks:java:container
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:go
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:go:container
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:go:examples
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:go:test
Found go 1.12 in /usr/bin/go, use it.

> Task :sdks:java:harness:shadowJar

> Task :sdks:go:resolveBuildDependencies
Resolving golang.org/x/net: commit='2fb46b16b8dda405028c50f7c7f0f9dd1fa6bfb1', urls=[https://go.googlesource.com/net]
Resolving golang.org/x/oauth2: commit='a032972e28060ca4f5644acffae3dfc268cc09db', urls=[https://go.googlesource.com/oauth2]
Resolving golang.org/x/sync: commit='fd80eb99c8f653c847d294a001bdf2a3a6f768f5', urls=[https://go.googlesource.com/sync]
Resolving golang.org/x/sys: commit='37707fdb30a5b38865cfb95e5aab41707daec7fd', urls=[https://go.googlesource.com/sys]
Resolving cached github.com/etcd-io/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/etcd-io/etcd.git, git@github.com:etcd-io/etcd.git]
Resolving cached github.com/etcd-io/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/etcd-io/etcd.git, git@github.com:etcd-io/etcd.git]

> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE

> Task :sdks:java:container:pullLicenses

> Configure project :sdks:python:container
Found go 1.12 in /usr/bin/go, use it.

> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:java-job-service:compileJava FROM-CACHE
> Task :runners:java-job-service:classes UP-TO-DATE
> Task :runners:java-job-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :runners:flink:1.10:compileJava FROM-CACHE
> Task :runners:flink:1.10:classes
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :runners:flink:1.10:jar
> Task :runners:flink:1.10:job-server:compileJava NO-SOURCE
> Task :runners:flink:1.10:job-server:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:direct-java:shadowJar
> Task :sdks:java:extensions:sql:compileJava FROM-CACHE
> Task :sdks:java:extensions:sql:classes
> Task :sdks:java:extensions:sql:jar
> Task :sdks:java:extensions:sql:zetasql:compileJava FROM-CACHE
> Task :sdks:java:extensions:sql:zetasql:classes UP-TO-DATE
> Task :sdks:java:extensions:sql:zetasql:jar
> Task :sdks:java:extensions:sql:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:extensions:sql:expansion-service:classes UP-TO-DATE
> Task :sdks:java:extensions:sql:expansion-service:jar

> Task :sdks:java:container:pullLicenses

> Task :sdks:java:container:generateLicenseReport

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 1m 14s
1 actionable task: 1 executed

Publishing build scan...
https://gradle.com/s/cy62q33edoerw

Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in <https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/sdks/java/container/build/virtualenv/bin/python3>
Also creating executable in <https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/sdks/java/container/build/virtualenv/bin/python>
Installing setuptools, pip, wheel...

> Task :sdks:java:extensions:sql:expansion-service:shadowJar

> Task :sdks:java:container:pullLicenses
done.
Collecting beautifulsoup4<5.0,>=4.9.0
  Using cached beautifulsoup4-4.9.1-py3-none-any.whl (115 kB)
Processing /home/jenkins/.cache/pip/wheels/c4/f0/ae/d4689c4532d1f111462ed6a884a7767d502e511ee65f0d8e1b/future-0.18.2-py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/57/d0/2c/e2003abb5bc1a94c2e8a6fe1c03b8055d074e34c13672e7eb7/PyYAML-5.3.1-cp35-cp35m-linux_x86_64.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting soupsieve>1.2
  Using cached soupsieve-2.0.1-py3-none-any.whl (32 kB)
Collecting six>=1.9.0
  Using cached six-1.15.0-py2.py3-none-any.whl (10 kB)
Installing collected packages: soupsieve, beautifulsoup4, future, pyyaml, six, tenacity
Successfully installed beautifulsoup4-4.9.1 future-0.18.2 pyyaml-5.3.1 six-1.15.0 soupsieve-2.0.1 tenacity-5.1.5
Executing <https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/sdks/java/container/build/virtualenv/bin/python> <https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py> --license_dir=<https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/java_third_party_licenses>        --dep_url_yaml=<https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml> 
INFO:root:Pulling license for 174 dependencies using 16 threads.
INFO:root:pull_licenses_java.py succeed. It took 2.866799 seconds with 16 threads.
Copy licenses to <https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/sdks/java/container/build/target/third_party_licenses>.
Finished license_scripts.sh

> Task :sdks:java:container:copyDockerfileDependencies

> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/api: commit='386d4e5f4f92f86e6aec85985761bba4b938a2d5', urls=[https://code.googlesource.com/google-api-go-client]
Resolving google.golang.org/genproto: commit='2b5a72b8730b0b16380010cfe5286c42108d88e7', urls=[https://github.com/google/go-genproto]
Resolving google.golang.org/grpc: commit='7646b5360d049a7ca31e9133315db43456f39e2e', urls=[https://github.com/grpc/grpc-go]
Resolving cached github.com/etcd-io/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/etcd-io/etcd.git, git@github.com:etcd-io/etcd.git]
Resolving cached github.com/etcd-io/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/etcd-io/etcd.git, git@github.com:etcd-io/etcd.git]

> Task :sdks:go:installDependencies
# github.com/apache/beam/sdks/go/pkg/beam/core/runtime/graphx
compile: writing output: write $WORK/b068/_pkg_.a: no space left on device
> Task :sdks:go:buildLinuxAmd64 FAILED
> Task :sdks:java:container:installDependencies
> Task :runners:flink:1.10:job-server:shadowJar

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:go:buildLinuxAmd64'.
> Build failed due to return code 2 of: 
  Command:
   /usr/bin/go build -o ./build/bin/integration github.com/apache/beam/sdks/go/test/integration
  Env:
   GOEXE=
   GOPATH=<https://ci-beam.apache.org/job/beam_PreCommit_Python2_PVR_Flink_Cron/ws/src/sdks/go/.gogradle/project_gopath>
   GOROOT=/usr/lib/go-1.12
   GOOS=linux
   GOARCH=amd64

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
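
The root cause here is the Go compile step exhausting disk on the executor ("no space left on device" under $WORK), not a code problem; note that the change list for #1190 above already includes "[BEAM-9865] Cleanup Jenkins WS on successful jobs", which targets exactly this kind of workspace growth. A small pre-flight check of the sort that makes such failures obvious (Python 3 because of shutil.disk_usage; the path and threshold are assumptions, not values reported by this build):

    import shutil

    WORKSPACE = '/home/jenkins/jenkins-home/workspace'  # hypothetical executor path
    MIN_FREE_GB = 10                                    # arbitrary safety margin

    free_gb = shutil.disk_usage(WORKSPACE).free / (1024 ** 3)
    if free_gb < MIN_FREE_GB:
        raise SystemExit('Only %.1f GB free under %s; clean old workspaces '
                         'before building.' % (free_gb, WORKSPACE))
    print('%.1f GB free under %s, proceeding.' % (free_gb, WORKSPACE))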

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 8s
99 actionable tasks: 70 executed, 28 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/ye7p3bmh4gze4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org