Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2017/05/10 04:50:36 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #2177

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/2177/display/redirect?page=changes>

Changes:

[tgroh] Mark PValue and PValueBase Internal

------------------------------------------
[...truncated 1.76 MB...]
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s14", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s13"
        }, 
        "serialized_fn": "<string of 1120 bytes>", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2017-05-10T04:43:52.075472Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2017-05-09_21_43_51-2632386391657042642'
 location: u'global'
 name: u'beamapp-jenkins-0510044337-649531'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2017-05-09_21_43_51-2632386391657042642]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.developers.google.com/project/apache-beam-testing/dataflow/job/2017-05-09_21_43_51-2632386391657042642
root: INFO: Job 2017-05-09_21_43_51-2632386391657042642 is in state JOB_STATE_RUNNING
root: INFO: 2017-05-10T04:43:51.632Z: JOB_MESSAGE_WARNING: (24881d79ea4a53be): Setting the number of workers (1) disables autoscaling for this job. If you are trying to cap autoscaling, consider only setting max_num_workers. If you want to disable autoscaling altogether, the documented way is to explicitly use autoscalingAlgorithm=NONE.
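
For reference, a minimal sketch of the distinction the warning above draws, using the Beam Python SDK's worker options (flag names per the SDK's WorkerOptions; the worker counts are placeholders):

    from apache_beam.options.pipeline_options import PipelineOptions

    # Caps autoscaling at 10 workers without disabling it.
    capped = PipelineOptions(['--max_num_workers=10'])

    # Disables autoscaling altogether, as the warning recommends.
    disabled = PipelineOptions(['--autoscaling_algorithm=NONE'])

    # What this job did: pinning the pool to exactly one worker,
    # which implicitly disables autoscaling and triggers the warning.
    pinned = PipelineOptions(['--num_workers=1'])
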
root: INFO: 2017-05-10T04:43:52.695Z: JOB_MESSAGE_DETAILED: (cf18b6e5c174977b): Checking required Cloud APIs are enabled.
root: INFO: 2017-05-10T04:43:54.885Z: JOB_MESSAGE_DEBUG: (cf18b6e5c174915c): Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
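
For context on the message above, a hedged sketch of the condition it describes (hypothetical pipeline, not the test's actual code): combiner lifting applies when a grouping feeds a combiner, whereas assert_that's bare GroupByKey feeds a plain ParDo, so the optimization is skipped.

    import apache_beam as beam

    with beam.Pipeline() as p:
        kvs = p | beam.Create([('a', 1), ('a', 2), ('b', 3)])
        # Eligible for combiner lifting: the grouping is fused with a
        # combiner, so partial sums can run before the shuffle.
        sums = kvs | beam.CombinePerKey(sum)
        # Not eligible: a bare GroupByKey followed by an ordinary Map,
        # the shape this log message reports being skipped.
        grouped = (kvs
                   | beam.GroupByKey()
                   | beam.Map(lambda kv: (kv[0], list(kv[1]))))
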
root: INFO: 2017-05-10T04:43:54.888Z: JOB_MESSAGE_DETAILED: (cf18b6e5c1749b66): Expanding GroupByKey operations into optimizable parts.
root: INFO: 2017-05-10T04:43:54.890Z: JOB_MESSAGE_DETAILED: (cf18b6e5c1749570): Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2017-05-10T04:43:54.895Z: JOB_MESSAGE_DEBUG: (cf18b6e5c1749984): Annotating graph with Autotuner information.
root: INFO: 2017-05-10T04:43:54.908Z: JOB_MESSAGE_DETAILED: (cf18b6e5c1749d98): Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2017-05-10T04:43:54.917Z: JOB_MESSAGE_DETAILED: (cf18b6e5c1749bb6): Unzipping flatten s10 for input s8.out
root: INFO: 2017-05-10T04:43:54.920Z: JOB_MESSAGE_DETAILED: (cf18b6e5c17495c0): Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten , into producer assert_that/Group/pair_with_0
root: INFO: 2017-05-10T04:43:54.923Z: JOB_MESSAGE_DETAILED: (cf18b6e5c1749fca): Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2017-05-10T04:43:54.925Z: JOB_MESSAGE_DETAILED: (cf18b6e5c17499d4): Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2017-05-10T04:43:54.930Z: JOB_MESSAGE_DETAILED: (cf18b6e5c17493de): Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2017-05-10T04:43:54.937Z: JOB_MESSAGE_DETAILED: (cf18b6e5c17497f2): Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2017-05-10T04:43:54.945Z: JOB_MESSAGE_DETAILED: (cf18b6e5c1749610): Unzipping flatten s10-u13 for input s11-reify-value0-c11
root: INFO: 2017-05-10T04:43:54.949Z: JOB_MESSAGE_DETAILED: (cf18b6e5c174901a): Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten , into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2017-05-10T04:43:54.951Z: JOB_MESSAGE_DETAILED: (cf18b6e5c1749a24): Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2017-05-10T04:43:54.953Z: JOB_MESSAGE_DETAILED: (cf18b6e5c174942e): Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2017-05-10T04:43:54.957Z: JOB_MESSAGE_DETAILED: (cf18b6e5c1749e38): Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2017-05-10T04:43:54.959Z: JOB_MESSAGE_DETAILED: (cf18b6e5c1749842): Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/Do
root: INFO: 2017-05-10T04:43:54.961Z: JOB_MESSAGE_DETAILED: (cf18b6e5c174924c): Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2017-05-10T04:43:54.964Z: JOB_MESSAGE_DETAILED: (cf18b6e5c1749c56): Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2017-05-10T04:43:54.966Z: JOB_MESSAGE_DETAILED: (cf18b6e5c1749660): Fusing consumer compute/Do into start/Read
root: INFO: 2017-05-10T04:43:55.026Z: JOB_MESSAGE_DEBUG: (cf18b6e5c1749750): Workflow config is missing a default resource spec.
root: INFO: 2017-05-10T04:43:55.028Z: JOB_MESSAGE_DETAILED: (cf18b6e5c174915a): Adding StepResource setup and teardown to workflow graph.
root: INFO: 2017-05-10T04:43:55.031Z: JOB_MESSAGE_DEBUG: (cf18b6e5c1749b64): Adding workflow start and stop steps.
root: INFO: 2017-05-10T04:43:55.033Z: JOB_MESSAGE_DEBUG: (cf18b6e5c174956e): Assigning stage ids.
root: INFO: 2017-05-10T04:43:55.073Z: JOB_MESSAGE_DEBUG: (d010e88f11ba2266): Executing wait step start21
root: INFO: 2017-05-10T04:43:55.081Z: JOB_MESSAGE_BASIC: (d010e88f11ba2e8d): Executing operation side/Read
root: INFO: 2017-05-10T04:43:55.092Z: JOB_MESSAGE_DEBUG: (d010e88f11ba29ef): Value "side/Read.out" materialized.
root: INFO: 2017-05-10T04:43:55.101Z: JOB_MESSAGE_BASIC: (d010e88f11ba2616): Executing operation compute/SideInput-s3
root: INFO: 2017-05-10T04:43:55.111Z: JOB_MESSAGE_DEBUG: (d010e88f11ba2178): Value "compute/SideInput-s3.output" materialized.
root: INFO: 2017-05-10T04:43:55.117Z: JOB_MESSAGE_BASIC: (d010e88f11ba2cda): Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2017-05-10T04:43:55.323Z: JOB_MESSAGE_DEBUG: (51e28024bc9536aa): Starting worker pool setup.
root: INFO: 2017-05-10T04:43:55.325Z: JOB_MESSAGE_BASIC: (51e28024bc95309c): Starting 1 workers...
root: INFO: 2017-05-10T04:43:55.340Z: JOB_MESSAGE_DEBUG: (d010e88f11ba25ed): Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2017-05-10T04:43:55.351Z: JOB_MESSAGE_BASIC: (d010e88f11ba2fc5): Executing operation start/Read+compute/Do+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2017-05-10T04:43:55.361Z: JOB_MESSAGE_BASIC: (9aa9701bfc6ae545): Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2017-05-10T04:45:02.352Z: JOB_MESSAGE_DETAILED: (32554706c3c5c9c1): Workers have started successfully.
root: INFO: 2017-05-10T04:47:46.860Z: JOB_MESSAGE_ERROR: (8343e7f25c211a5d): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 581, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 166, in execute
    op.start()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/native_operations.py", line 49, in start
    self._current_progress = get_progress()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/workercustomsources.py", line 126, in get_progress
    return iobase.ReaderProgress(
AttributeError: 'module' object has no attribute 'ReaderProgress'
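
A plausible reading of this error: the Dataflow worker's pre-installed code (dataflow_worker/workercustomsources.py) expects a ReaderProgress attribute on the Beam iobase module, but the SDK version staged for this run does not expose that name, so the attribute lookup on the module fails. A minimal sketch of the failure mode (the import path is an assumption; this is not the worker's actual code):

    from apache_beam.io import iobase

    # The worker effectively evaluates iobase.ReaderProgress(...), which
    # raises AttributeError when the staged SDK lacks the name. A
    # defensive check would make the version skew explicit:
    if not hasattr(iobase, 'ReaderProgress'):
        raise RuntimeError(
            'SDK/worker version skew: iobase has no ReaderProgress')
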

[The identical JOB_MESSAGE_ERROR traceback was logged eight more times, at 2017-05-10T04:47:50.231Z, 04:47:53.622Z, 04:47:56.962Z, 04:48:00.321Z, 04:48:03.693Z, 04:48:07.082Z, 04:48:10.512Z, and 04:48:13.934Z.]

root: INFO: 2017-05-10T04:48:14.081Z: JOB_MESSAGE_DEBUG: (d010e88f11ba274e): Executing failure step failure20
root: INFO: 2017-05-10T04:48:14.083Z: JOB_MESSAGE_ERROR: (d010e88f11ba25c4): Workflow failed. Causes: (d010e88f11ba2a62): S05:start/Read+compute/Do+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed.
root: INFO: 2017-05-10T04:48:14.140Z: JOB_MESSAGE_DETAILED: (cf18b6e5c17494a5): Cleaning up.
root: INFO: 2017-05-10T04:48:14.270Z: JOB_MESSAGE_DEBUG: (cf18b6e5c1749eaf): Starting worker pool teardown.
root: INFO: 2017-05-10T04:48:14.272Z: JOB_MESSAGE_BASIC: (cf18b6e5c17498b9): Stopping worker pool...
root: INFO: 2017-05-10T04:49:39.278Z: JOB_MESSAGE_BASIC: (cf18b6e5c1749fc8): Worker pool stopped.
root: INFO: 2017-05-10T04:49:39.325Z: JOB_MESSAGE_DEBUG: (cf18b6e5c17494f5): Tearing down pending resources...
root: INFO: Job 2017-05-09_21_43_51-2632386391657042642 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 15 tests in 1444.998s

FAILED (errors=14)
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_26_43-7494046603359720390?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_33_05-17778654726138252063?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_39_35-1019057384112298327?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_26_43-2743303429761940666?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_32_15-9014687942684870733?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_38_26-5769702013089870763?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_43_51-2632386391657042642?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_26_43-8830841609316196593?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_32_06-878169968002913237?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_38_22-13458795397275563733?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_45_16-14516527290768753107?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_26_41-18175790497212241534?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_32_43-10126553643447272512?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_38_08-2485544320882357778?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-09_21_43_29-12045478608446743663?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user klk@google.com
Not sending mail to unregistered user kirpichov@google.com
Not sending mail to unregistered user ccy@google.com

Jenkins build is back to normal : beam_PostCommit_Python_Verify #2179

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/2179/display/redirect?page=changes>


Build failed in Jenkins: beam_PostCommit_Python_Verify #2178

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/2178/display/redirect?page=changes>

Changes:

[chamikara] Mark PipelineVisitor and AppliedPTransform as internal.

------------------------------------------
[...truncated 583.17 KB...]
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_merge_tagged_vals_under_key"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Group/Map(_merge_tagged_vals_under_key).out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s11"
        }, 
        "serialized_fn": "<string of 1344 bytes>", 
        "user_name": "assert_that/Group/Map(_merge_tagged_vals_under_key)"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s13", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s12"
        }, 
        "serialized_fn": "<string of 964 bytes>", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s14", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s13"
        }, 
        "serialized_fn": "<string of 1116 bytes>", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '441', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 10 May 2017 07:22:30 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(fb22bb4935c0dc05): The workflow could not be created. Causes: (6df4931edcda8554): Too many running jobs. Project apache-beam-testing is running 25 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
[The same 429 response and retry were logged three more times, at 07:22:32, 07:22:37, and 07:22:48 GMT, each with the same RESOURCE_EXHAUSTED error.]
root: ERROR: HTTP status 429 trying to create job at dataflow service endpoint https://dataflow.googleapis.com
root: CRITICAL: details of server error: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '441', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 10 May 2017 07:23:07 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(f61fa0d40c324813): The workflow could not be created. Causes: (85181ea36db4428f): Too many running jobs. Project apache-beam-testing is running 25 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
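
The sequence above is the standard retry-then-fail pattern: the client retries job creation with increasing delays while the project sits at its 25-active-job quota, then surfaces the final 429 as a CRITICAL error. A minimal sketch of that pattern (illustrative only, not Beam's actual client code; QuotaExceeded stands in for the HTTP 429 HttpError):

    import random
    import time

    class QuotaExceeded(Exception):
        """Stands in for an HTTP 429 RESOURCE_EXHAUSTED response."""

    def create_job_with_backoff(create_job, attempts=5, base_delay=2.0):
        for attempt in range(attempts):
            try:
                return create_job()
            except QuotaExceeded:
                if attempt == attempts - 1:
                    raise  # out of retries; the caller logs it as CRITICAL
                # Exponential backoff with jitter before the next attempt,
                # matching the widening gaps in the timestamps above.
                time.sleep(base_delay * 2 ** attempt + random.uniform(0, 1))
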
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 15 tests in 1482.331s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_09_51-14322785757164930721?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_16_08-14527735254560773830?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_22_20-16513378811748848808?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_27_49-11934288747770486948?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_09_48-12583129661156031828?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_16_25-5994599331681716108?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_22_15-9143139617157447675?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_28_21-7513777183059118878?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_09_48-12909723208104283468?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_17_03-8512959352195195629?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_23_13-10107273846410910154?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_09_48-8292473655447987501?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_16_44-5863574620033422371?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-10_00_23_25-3291124431490556190?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user klk@google.com
Not sending mail to unregistered user kirpichov@google.com
Not sending mail to unregistered user ccy@google.com