Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/03/26 08:46:27 UTC

Build failed in Jenkins: beam_PostCommit_Python36 #2076

See <https://builds.apache.org/job/beam_PostCommit_Python36/2076/display/redirect?page=changes>

Changes:

[github] Add notes to change log.

[github] [BEAM-9371] Add SideInputLoadTest to Java SDK (#10949)


------------------------------------------
[...truncated 10.63 MB...]
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 783, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1115, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1439, in create_par_do
    parameter)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1475, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict
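
The "TypeError: Expected tuple, got dict" above is raised inside Cython's
__setstate_cython__ while unpickling the serialized DoFn payload, which
usually indicates that the code that pickled the window state and the SDK
harness that unpickles it disagree on the state format. A minimal
illustrative sketch of that kind of mismatch (the class and state layout
below are hypothetical, not Beam's actual _IntervalWindowBase code):

    class IntervalLike:
        """Toy value type whose pickle state is a (start, end) tuple."""

        def __init__(self, start, end):
            self.start, self.end = start, end

        def __getstate__(self):
            # Current code serializes state as a tuple...
            return (self.start, self.end)

        def __setstate__(self, state):
            # ...so state produced in another format (e.g. a dict written by
            # a different code version) is rejected with the same error.
            if not isinstance(state, tuple):
                raise TypeError(
                    'Expected tuple, got %s' % type(state).__name__)
            self.start, self.end = state

    IntervalLike(0, 10).__setstate__({'start': 0, 'end': 10})  # raises TypeError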

        java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
        org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
        org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:332)
        org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
        org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1358)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1100(StreamingDataflowWorker.java:153)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$7.run(StreamingDataflowWorker.java:1081)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -8521: Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 313, in get
    processor = self.cached_bundle_processors[bundle_descriptor_id].pop()
IndexError: pop from empty list
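
This "IndexError: pop from empty list" is only the first link in the chained
traceback: the SDK worker keeps a pool of reusable bundle processors per
descriptor id, and an empty pool is the normal signal to construct a fresh
one. The real failure is the unpickling error raised while building that new
processor, reported "during handling" of the IndexError. A rough sketch of
the pop-or-create pattern (names and the factory below are illustrative, not
Beam's exact sdk_worker code):

    from collections import defaultdict

    class ProcessorCache:
        """Reuses released processors; creates a new one when the pool is empty."""

        def __init__(self, create_fn):
            self._create_fn = create_fn
            self._pool = defaultdict(list)

        def get(self, descriptor_id):
            try:
                # Fast path: reuse a previously released processor.
                return self._pool[descriptor_id].pop()
            except IndexError:
                # Empty pool: construct a fresh processor. Any exception here
                # surfaces as "During handling of the above exception ...".
                return self._create_fn(descriptor_id)

        def release(self, descriptor_id, processor):
            self._pool[descriptor_id].append(processor)

    cache = ProcessorCache(lambda descriptor_id: object())  # stand-in factory
    processor = cache.get('descriptor-1')
    cache.release('descriptor-1', processor)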

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 283, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in _execute
    response = task()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 247, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 416, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 445, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 319, in get
    self.data_channel_factory)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 744, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 797, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 796, in <listcomp>
    (transform_id, get_operation(transform_id)) for transform_id in sorted(
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 783, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1115, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1439, in create_par_do
    parameter)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1475, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict

        org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:160)
        org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        java.lang.Thread.run(Thread.java:748)
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-26_00_37_04-16079580558212917490 after 365 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
apache_beam.io.gcp.tests.pubsub_matcher: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_output6963d810-edc0-400c-b19f-658169645632.
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_37_00-6827016893288422873?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_00_32-17957435058656002264?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_09_53-8650438356110470574?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_19_21-4815156533372977480?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_27_29-18067823741883096838?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_37_04-8491941600098527558?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_51_11-67662364084758219?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_00_06-7298547718739781496?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_08_49-3434938534605566343?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_17_31-4375109551104437581?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_26_35-2587404133405115583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_37_03-18146659383115363281?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_49_36-17289011573178669256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_58_04-16701000273517047435?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_06_46-7712046783070793931?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_15_18-11932495388815050289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_24_16-4089131568435352921?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_34_41-5761626420617674231?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_37_00-11394061618679583134?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_58_56-569361509654892507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_07_52-1255353150593675220?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_16_21-3043272951910620149?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_26_18-8567640599077761422?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_37_01-5106400362029175043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_47_04-9762578751481349135?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_56_24-4831355488878020055?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_05_31-827775955900629300?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_14_04-7346685975606478099?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_22_51-10698249725191282145?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_33_09-11339101125786309742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_37_05-6389890454826454423?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_45_54-2637981570561634556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_55_37-12102442876801717475?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_04_51-3616805128624262102?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_13_46-9341278113348370797?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_21_42-18364166107921753314?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_29_25-7712305957921768384?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_37_38-2681870960261183141?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_37_00-14275374606067467746?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_48_23-2287535133893153083?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_01_48-8569719354926766440?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_10_27-857531375316148382?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_27_30-5834887288683709143?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_37_04-16079580558212917490?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_00_51_56-12062417560754711660?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_02_05-15463125748071752188?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_11_54-15050058580164019581?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_20_43-2616975892778840457?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_01_29_44-6081396342287815136?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 59 tests in 4223.435s

FAILED (SKIP=9, failures=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 38s
86 actionable tasks: 64 executed, 22 from cache

Publishing build scan...
https://gradle.com/s/bvhdg52k7y3i4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python36 #2081

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/2081/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python36 #2080

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/2080/display/redirect?page=changes>

Changes:

[robertwb] Add base SDK version to environment capabilities for Python and Java.

[robertwb] [BEAM-9614] Add SDK id for go.

[github] [BEAM-9495] Make DataCatalogTableProvider AutoCloseable (#11116)


------------------------------------------
Started by GitHub push by robertwb
Started by GitHub push by robertwb
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Python36/ws/>
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0ea3ec261a4d2813e763058861a6ea723fd7f533 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0ea3ec261a4d2813e763058861a6ea723fd7f533
Commit message: "[BEAM-9495] Make DataCatalogTableProvider AutoCloseable (#11116)"
 > git rev-list --no-walk 589a3037f08920601dc53a7ced9e175142f03b5c # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :python36PostCommit
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.

> Configure project :sdks:python:container
Found go 1.12 in /usr/bin/go, use it.

FAILURE: Build failed with an exception.

* What went wrong:
Could not determine the dependencies of task ':runners:flink:1.10:job-server:shadowJar'.
> Could not resolve all dependencies for configuration ':runners:flink:1.10:job-server:runtimeClasspath'.
   > Could not resolve io.grpc:grpc-api:[1.27.2].
     Required by:
         project :runners:flink:1.10:job-server > project :sdks:java:io:google-cloud-platform > io.grpc:grpc-all:1.27.2
         project :runners:flink:1.10:job-server > project :sdks:java:io:google-cloud-platform > io.grpc:grpc-auth:1.27.2
         project :runners:flink:1.10:job-server > project :sdks:java:io:google-cloud-platform > io.grpc:grpc-core:1.27.2
      > Failed to list versions for io.grpc:grpc-api.
         > Unable to load Maven meta-data from https://oss.sonatype.org/content/repositories/staging/io/grpc/grpc-api/maven-metadata.xml.
            > Could not HEAD 'https://oss.sonatype.org/content/repositories/staging/io/grpc/grpc-api/maven-metadata.xml'. Received status code 502 from server: Bad Gateway

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 19s

Publishing build scan...
https://gradle.com/s/4pamkv5iqffr4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python36 #2079

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/2079/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-9340] Validate pipeline requirements in PipelineValidator.

[mxm] [BEAM-9566] Mitigate performance issue for output timestamp watermark


------------------------------------------
[...truncated 10.68 MB...]
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 783, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1115, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1439, in create_par_do
    parameter)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1475, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict

        java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
        org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
        org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:332)
        org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
        org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1358)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1100(StreamingDataflowWorker.java:153)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$7.run(StreamingDataflowWorker.java:1081)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -8512: Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 313, in get
    processor = self.cached_bundle_processors[bundle_descriptor_id].pop()
IndexError: pop from empty list

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 283, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in _execute
    response = task()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 247, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 416, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 445, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 319, in get
    self.data_channel_factory)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 744, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 797, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 796, in <listcomp>
    (transform_id, get_operation(transform_id)) for transform_id in sorted(
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 783, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1115, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1439, in create_par_do
    parameter)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1475, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict

        org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:160)
        org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        java.lang.Thread.run(Thread.java:748)
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-26_11_12_47-11959085498207036348 after 364 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
apache_beam.io.gcp.tests.pubsub_matcher: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_output9ee65167-385f-47d8-98d1-80fe7883cfe1.
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_12_46-16212710577986439312?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_26_42-10333671727134007446?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_36_27-11817657146794531035?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_45_05-8329736409242216882?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_53_41-16908016309138450625?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_12_02_32-10231556787453670414?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_12_44-10966227782284990128?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_33_50-11264104602426240651?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_43_39-11815680862056686750?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_53_54-8479047235642277602?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_12_02_49-11947369494158018026?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_12_47-4363077015874136965?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_25_08-498078249777750248?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_33_13-4185182439088799936?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_41_52-18140080355334342522?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_51_19-8601588429318160600?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_59_46-1898067512645819754?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_12_09_07-18342879837867559207?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_12_42-1282639198750776969?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_33_42-13658802983835440443?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_41_59-7643605824280750982?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_50_43-1863249152072972047?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_12_00_56-7187587036425463697?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_12_43-16276873868405008531?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_22_26-12183034311026328153?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_32_31-18024719890077761551?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_40_58-4680674163978906618?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_49_47-9733232299425970612?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_58_52-3256975504232493273?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_12_07_34-9691799676420845873?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_12_43-14604814365773913283?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_20_53-14489185648517835675?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_30_44-15110724835193902583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_39_27-13162266357363964708?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_48_04-12320915079265381634?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_56_14-11805866090138868184?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_12_05_16-11330589001917264081?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_12_43-12910563057280756038?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_23_02-1352156565581016746?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_36_12-14831876858901765464?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_45_56-908842817362257619?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_12_02_42-454727683385758802?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_12_47-11959085498207036348?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_27_31-8805889660640761140?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_38_26-41949014135223509?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_49_04-16222693521592134776?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_11_56_44-11764026375173761340?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_12_04_49-4857994465241808705?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_12_13_04-5275767207434533639?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 59 tests in 4135.915s

FAILED (SKIP=9, failures=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 44s
86 actionable tasks: 69 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/wsbcclst5knsq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python36 #2078

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/2078/display/redirect>

Changes:


------------------------------------------
[...truncated 10.69 MB...]
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 783, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1115, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1439, in create_par_do
    parameter)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1475, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict

        java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
        org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
        org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:332)
        org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
        org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1358)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1100(StreamingDataflowWorker.java:153)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$7.run(StreamingDataflowWorker.java:1081)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -8615: Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 313, in get
    processor = self.cached_bundle_processors[bundle_descriptor_id].pop()
IndexError: pop from empty list

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 283, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in _execute
    response = task()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 247, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 416, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 445, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 319, in get
    self.data_channel_factory)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 744, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 797, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 796, in <listcomp>
    (transform_id, get_operation(transform_id)) for transform_id in sorted(
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 783, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1115, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1439, in create_par_do
    parameter)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1475, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict

        org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:160)
        org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        java.lang.Thread.run(Thread.java:748)
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-26_05_04_40-3511473735514937643 after 361 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
apache_beam.io.gcp.tests.pubsub_matcher: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_outputb37adff5-e762-4ff0-910c-f140c9e552bc.
--------------------- >> end captured logging << ---------------------
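
The recurring failure in the captured logging is the unpickling of the serialized DoFn: dill hands apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__ a dict where it expects a tuple, while the earlier IndexError from cached_bundle_processors[...].pop() looks like the ordinary cache-miss path rather than the root cause. A minimal, purely illustrative Python sketch of this class of error (hypothetical stand-in code, not Beam's implementation):

    import pickle

    class Window(object):
        # Hypothetical stand-in for a Cython-backed window class.
        def __init__(self, start, end):
            self.start, self.end = start, end

        def __getstate__(self):
            # Suppose the serializing side emits a dict for the window state...
            return {'start': self.start, 'end': self.end}

        def __setstate__(self, state):
            # ...while the deserializing side insists on a tuple, as in the traceback above.
            if not isinstance(state, tuple):
                raise TypeError('Expected tuple, got %s' % type(state).__name__)
            self.start, self.end = state

    pickle.loads(pickle.dumps(Window(0, 10)))  # TypeError: Expected tuple, got dict

In a real job this kind of mismatch usually suggests that the environment that pickled the pipeline and the worker environment that unpickles it disagree about the state format (for example, a version skew); the log itself does not confirm the cause.
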
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_04_40-1830862711870805915?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_18_52-2761752008312539304?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_27_21-4838025175394301045?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_35_54-5251696734464435641?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_44_06-4238522096801608892?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_53_38-4090427448431001514?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_04_36-907005317476626685?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_26_02-13541036580418036833?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_34_49-15303209218246373202?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_43_17-10824882195430890575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_51_52-16465824773357739649?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_04_39-18408932133786255628?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_16_39-7140491718600234974?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_24_58-15894368718709657608?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_34_11-5976958568327851335?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_43_17-2283374540400894707?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_51_29-7635557993246948972?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_59_51-5611781984504179226?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_04_36-3033622510513157939?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_27_12-9787345847179466759?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_35_38-12529293201228962689?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_44_16-6489305520497069960?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_52_38-10962451448881827549?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_04_36-4305047535582353918?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_14_34-9813695421464566577?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_23_45-10490663859774537413?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_32_08-3255159385793800611?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_41_32-7300810364470483302?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_50_28-15096341270033999486?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_59_27-3402507566175484552?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_04_35-8963007376923456938?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_13_25-13955202613692523102?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_22_55-10985837456704883559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_31_39-12122274979532666980?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_41_14-5963436694114154646?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_48_56-11270193420356744311?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_56_01-2098734078462374591?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_06_04_28-8343017532824247375?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_04_36-16585791121916179442?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_15_56-8423747877263078899?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_28_20-690730343585908841?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_37_24-13666622239517693007?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_54_28-13736085457543864635?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_04_40-3511473735514937643?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_19_30-9748008502319528843?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_30_03-4244717915727739178?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_39_56-10957751775413194177?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_48_23-17165152721691951318?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_05_56_18-2103144531952918260?project=apache-beam-testing
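
For pickling failures like the one above, one quick local check is to round-trip the suspect DoFn through Beam's own pickler before submitting the job. A minimal sketch, assuming only that apache_beam is installed locally; it can surface in-process serialization problems but cannot reproduce a version skew between the submitting environment and the worker container:

    import apache_beam as beam
    from apache_beam.internal import pickler

    class EchoFn(beam.DoFn):
        # Hypothetical DoFn used only for the round-trip check.
        def process(self, element):
            yield element

    # Serialize and deserialize with the same pickler module the SDK harness
    # uses (the traceback above fails inside pickler.loads on the worker).
    restored = pickler.loads(pickler.dumps(EchoFn()))
    print(type(restored).__name__)  # 'EchoFn' if the round trip succeeded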

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 59 tests in 4112.065s

FAILED (SKIP=9, failures=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 9m 49s
86 actionable tasks: 64 executed, 22 from cache

Publishing build scan...
https://gradle.com/s/oblmlewvvlar6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python36 #2077

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/2077/display/redirect?page=changes>

Changes:

[github] [BEAM-7505] Add side input load test to Python SDK  (#11136)


------------------------------------------
[...truncated 10.68 MB...]
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 783, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1115, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1439, in create_par_do
    parameter)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1475, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict

        java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
        java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
        org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
        org.apache.beam.runners.dataflow.worker.fn.control.RegisterAndProcessBundleOperation.finish(RegisterAndProcessBundleOperation.java:332)
        org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:85)
        org.apache.beam.runners.dataflow.worker.fn.control.BeamFnMapTaskExecutor.execute(BeamFnMapTaskExecutor.java:125)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1358)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1100(StreamingDataflowWorker.java:153)
        org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$7.run(StreamingDataflowWorker.java:1081)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction -8336: Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 313, in get
    processor = self.cached_bundle_processors[bundle_descriptor_id].pop()
IndexError: pop from empty list

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 283, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 190, in _execute
    response = task()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 247, in <lambda>
    lambda: self.create_worker().do_instruction(request), request)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 416, in do_instruction
    getattr(request, request_type), request.instruction_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 445, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py", line 319, in get
    self.data_channel_factory)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 744, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 797, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 796, in <listcomp>
    (transform_id, get_operation(transform_id)) for transform_id in sorted(
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in get_operation
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 780, in <dictcomp>
    pcoll_id in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 778, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 705, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 783, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1115, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1439, in create_par_do
    parameter)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1475, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.6/site-packages/apache_beam/internal/pickler.py", line 287, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python3.6/site-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "stringsource", line 17, in apache_beam.utils.windowed_value._IntervalWindowBase.__setstate_cython__
TypeError: Expected tuple, got dict

        org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:160)
        org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:251)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailableInternal(ServerCallImpl.java:309)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:292)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:782)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
        org.apache.beam.vendor.grpc.v1p26p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        java.lang.Thread.run(Thread.java:748)
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-26_03_36_53-15418282026879900417 after 365 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
apache_beam.io.gcp.tests.pubsub_matcher: ERROR: Timeout after 400 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/wc_subscription_output013e87ee-9e83-4c5f-a842-c9e13f854cad.
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_36_52-9549145595772205066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_51_03-11363087091557433159?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_00_37-13662123048790093438?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_09_11-15773890315606006052?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_17_45-1021576242032208038?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_25_45-10452366559277837743?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_34_09-17557564107284332636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_36_48-9442720042009169546?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_02_07-14262793624414283778?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_11_20-11454404172601203516?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_28_24-13555188822506386399?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_36_52-4585875286961751722?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_48_44-5701263857888289438?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_57_27-3210063820844220327?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_06_45-1774896560757776716?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_16_32-8209012957595081833?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_24_58-4777438312409617102?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_36_49-1816096313703764855?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_46_16-1647748098862462749?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_55_51-1251480323514759421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_04_43-15662103256206373161?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_13_16-2725932401474146253?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_20_57-4152726744740924878?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_28_33-7375417985602540017?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_37_31-13889274278905132988?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_36_48-10563755923769453905?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_57_13-8985945934480801247?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_05_44-3502968497321783061?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_14_26-1216098671961288654?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_22_48-1018974845099935196?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_31_46-18123259357597710456?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_36_48-6119538905545488009?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_45_54-3595799981326008007?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_55_34-14499335611931526063?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_05_07-2619311610518391825?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_14_50-18428055109594243025?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_24_47-8336897829971475946?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_36_48-17533306281166569782?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_47_15-6990947996316828676?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_00_25-15182145148804408706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_09_22-6550788650343738033?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_19_16-7927728098506609200?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_27_28-3630662911474190403?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_36_53-15418282026879900417?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_03_51_49-3263473212359173073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_03_37-14954166510850657540?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_12_45-18236713343184076866?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_21_03-12929962624442617683?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-26_04_29_12-4317119078797951630?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 59 tests in 4167.437s

FAILED (SKIP=9, failures=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 0s
86 actionable tasks: 64 executed, 22 from cache

Publishing build scan...
https://gradle.com/s/ew4d3ytpxlvlc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org