Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2017/04/01 03:15:12 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #1700

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/1700/display/redirect>

------------------------------------------
[...truncated 665.52 KB...]
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 192, in run_transform
    return m(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 378, in run_ParDo
    si_labels[side_pval] = self._cache.get_pvalue(side_pval).step_name
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 266, in get_pvalue
    self._ensure_pvalue_has_real_producer(pvalue)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 239, in _ensure_pvalue_has_real_producer
    real_producer = pvalue.producer
AttributeError: 'AsList' object has no attribute 'producer'
-------------------- >> begin captured logging << --------------------
root: DEBUG: PValue computed by main input (tag None): refcount: 1 => 0
root: ERROR: Error while visiting Map(<lambda at sideinputs_test.py:259>)
--------------------- >> end captured logging << ---------------------
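
For context, each failing test here hands an apache_beam.pvalue wrapper (AsList, AsSingleton, AsIter) directly to Map/FlatMap as a side input; at this revision the DataflowRunner's run_ParDo then looks up .producer on the wrapper itself, which is what raises the AttributeError above. A minimal sketch of that pattern (element values invented purely for illustration, not taken from the tests):

import apache_beam as beam
from apache_beam import pvalue

p = beam.Pipeline()
main = p | 'main input' >> beam.Create([1, 2, 3])
side = p | 'side' >> beam.Create([10])

# The wrappers below are passed straight to Map as extra arguments; each
# element of `main` then sees the side input materialized as a list, or as
# a single value with a default.
with_list = main | 'use list' >> beam.Map(
    lambda x, xs: x + sum(xs), pvalue.AsList(side))
with_singleton = main | 'use singleton' >> beam.Map(
    lambda x, s: x + s, pvalue.AsSingleton(side, default_value=0))

p.run()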

======================================================================
ERROR: test_as_singleton_with_different_defaults (apache_beam.transforms.sideinputs_test.SideInputsTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 248, in test_as_singleton_with_different_defaults
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/test_pipeline.py",> line 91, in run
    result = super(TestPipeline, self).run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 169, in run
    return self.runner.run(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 32, in run
    self.result = super(TestDataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 161, in run
    super(DataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 122, in run
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 192, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 471, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 474, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 117, in visit_transform
    self.runner.run_transform(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 192, in run_transform
    return m(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 378, in run_ParDo
    si_labels[side_pval] = self._cache.get_pvalue(side_pval).step_name
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 266, in get_pvalue
    self._ensure_pvalue_has_real_producer(pvalue)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 239, in _ensure_pvalue_has_real_producer
    real_producer = pvalue.producer
AttributeError: 'AsSingleton' object has no attribute 'producer'
-------------------- >> begin captured logging << --------------------
root: DEBUG: PValue computed by main input (tag None): refcount: 1 => 0
root: ERROR: Error while visiting Map(<lambda at sideinputs_test.py:235>)
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_as_singleton_without_unique_labels (apache_beam.transforms.sideinputs_test.SideInputsTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 226, in test_as_singleton_without_unique_labels
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/test_pipeline.py",> line 91, in run
    result = super(TestPipeline, self).run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 169, in run
    return self.runner.run(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 32, in run
    self.result = super(TestDataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 161, in run
    super(DataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 122, in run
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 192, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 471, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 474, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 117, in visit_transform
    self.runner.run_transform(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 192, in run_transform
    return m(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 378, in run_ParDo
    si_labels[side_pval] = self._cache.get_pvalue(side_pval).step_name
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 266, in get_pvalue
    self._ensure_pvalue_has_real_producer(pvalue)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 239, in _ensure_pvalue_has_real_producer
    real_producer = pvalue.producer
AttributeError: 'AsSingleton' object has no attribute 'producer'
-------------------- >> begin captured logging << --------------------
root: DEBUG: PValue computed by main input (tag None): refcount: 1 => 0
root: ERROR: Error while visiting Map(<lambda at sideinputs_test.py:214>)
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 168, in test_default_value_singleton_side_input
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/test_pipeline.py",> line 91, in run
    result = super(TestPipeline, self).run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 169, in run
    return self.runner.run(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 32, in run
    self.result = super(TestDataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 161, in run
    super(DataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 122, in run
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 192, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 471, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 474, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 117, in visit_transform
    self.runner.run_transform(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 192, in run_transform
    return m(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 378, in run_ParDo
    si_labels[side_pval] = self._cache.get_pvalue(side_pval).step_name
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 266, in get_pvalue
    self._ensure_pvalue_has_real_producer(pvalue)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 239, in _ensure_pvalue_has_real_producer
    real_producer = pvalue.producer
AttributeError: 'AsSingleton' object has no attribute 'producer'
-------------------- >> begin captured logging << --------------------
root: DEBUG: PValue computed by start (tag None): refcount: 1 => 0
root: ERROR: Error while visiting FlatMap(<lambda at sideinputs_test.py:166>)
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 146, in test_empty_singleton_side_input
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/test_pipeline.py",> line 91, in run
    result = super(TestPipeline, self).run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 169, in run
    return self.runner.run(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 32, in run
    self.result = super(TestDataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 161, in run
    super(DataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 122, in run
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 192, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 471, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 474, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 117, in visit_transform
    self.runner.run_transform(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 192, in run_transform
    return m(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 378, in run_ParDo
    si_labels[side_pval] = self._cache.get_pvalue(side_pval).step_name
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 266, in get_pvalue
    self._ensure_pvalue_has_real_producer(pvalue)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 239, in _ensure_pvalue_has_real_producer
    real_producer = pvalue.producer
AttributeError: 'AsSingleton' object has no attribute 'producer'
-------------------- >> begin captured logging << --------------------
root: DEBUG: PValue computed by start (tag None): refcount: 1 => 0
root: ERROR: Error while visiting compute
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_flattened_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 307, in test_flattened_side_input
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/test_pipeline.py",> line 91, in run
    result = super(TestPipeline, self).run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 169, in run
    return self.runner.run(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 32, in run
    self.result = super(TestDataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 161, in run
    super(DataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 122, in run
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 192, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 471, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 474, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 117, in visit_transform
    self.runner.run_transform(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 192, in run_transform
    return m(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 378, in run_ParDo
    si_labels[side_pval] = self._cache.get_pvalue(side_pval).step_name
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 266, in get_pvalue
    self._ensure_pvalue_has_real_producer(pvalue)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 239, in _ensure_pvalue_has_real_producer
    real_producer = pvalue.producer
AttributeError: 'AsList' object has no attribute 'producer'
-------------------- >> begin captured logging << --------------------
root: DEBUG: PValue computed by side1 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by side2 (tag None): refcount: 1 => 0
root: DEBUG: PValue computed by main input (tag None): refcount: 1 => 0
root: ERROR: Error while visiting FlatMap(<lambda at sideinputs_test.py:303>)
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 179, in test_iterable_side_input
    pipeline.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/test_pipeline.py",> line 91, in run
    result = super(TestPipeline, self).run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 169, in run
    return self.runner.run(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 32, in run
    self.result = super(TestDataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 161, in run
    super(DataflowRunner, self).run(pipeline)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 122, in run
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 192, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 471, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/pipeline.py",> line 474, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 117, in visit_transform
    self.runner.run_transform(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 192, in run_transform
    return m(transform_node)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 378, in run_ParDo
    si_labels[side_pval] = self._cache.get_pvalue(side_pval).step_name
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 266, in get_pvalue
    self._ensure_pvalue_has_real_producer(pvalue)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/sdks/python/apache_beam/runners/runner.py",> line 239, in _ensure_pvalue_has_real_producer
    real_producer = pvalue.producer
AttributeError: 'AsIter' object has no attribute 'producer'
-------------------- >> begin captured logging << --------------------
root: DEBUG: PValue computed by start (tag None): refcount: 1 => 0
root: ERROR: Error while visiting compute
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 13 tests in 362.239s

FAILED (errors=9)
Build step 'Execute shell' marked build as failure

Jenkins build is back to normal : beam_PostCommit_Python_Verify #1702

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/1702/display/redirect>


Build failed in Jenkins: beam_PostCommit_Python_Verify #1701

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/1701/display/redirect?page=changes>

Changes:

[robertwb] Only encode PCollection outputs in Runner API protos.

[robertwb] Ensure transforms are picklable before materializing to protos.

[robertwb] Fix side inputs on dataflow runner.

------------------------------------------
[...truncated 876.35 KB...]
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Group/GroupByKey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s12"
        }, 
        "serialized_fn": "<string of 232 bytes>", 
        "user_name": "assert_that/Group/GroupByKey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s14", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_merge_tagged_vals_under_key"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Group/Map(_merge_tagged_vals_under_key).out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s13"
        }, 
        "serialized_fn": "<string of 1332 bytes>", 
        "user_name": ""
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s15", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s14"
        }, 
        "serialized_fn": "<string of 956 bytes>", 
        "user_name": ""
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s16", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s15"
        }, 
        "serialized_fn": "<string of 1112 bytes>", 
        "user_name": ""
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
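
The s13-s16 steps above (assert_that/Group/GroupByKey, assert_that/Group/Map(_merge_tagged_vals_under_key), assert_that/Unkey, assert_that/Match with fn _equal) are the serialized expansion of the SDK's assert_that matcher. A rough sketch of the test-side code that yields a sub-graph like this (pipeline contents invented for illustration; newer SDKs export the matchers from apache_beam.testing.util, older releases exposed them from apache_beam.transforms.util):

import apache_beam as beam
from apache_beam.testing.util import assert_that, equal_to

p = beam.Pipeline()
pcoll = p | beam.Create([1, 2, 3])
# assert_that expands into the GroupByKey / Map(_merge_tagged_vals_under_key)
# / Unkey / Match steps seen in the job description above.
assert_that(pcoll, equal_to([1, 2, 3]))
p.run()
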
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '441', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Sat, 01 Apr 2017 05:07:54 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(c5d8d2f59aa86ef6): The workflow could not be created. Causes: (e4e4b924d8b00fb4): Too many running jobs. Project apache-beam-testing is running 28 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '440', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Sat, 01 Apr 2017 05:07:56 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(7e7b44fd201d43f): The workflow could not be created. Causes: (33d757580c19b874): Too many running jobs. Project apache-beam-testing is running 28 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '441', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Sat, 01 Apr 2017 05:08:01 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(67ed1cd3858cf17d): The workflow could not be created. Causes: (a585434bd06ace48): Too many running jobs. Project apache-beam-testing is running 28 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
root: DEBUG: Response returned status 429, retrying
root: DEBUG: Retrying request to url https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json after exception HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '441', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Sat, 01 Apr 2017 05:08:12 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(f35652d4c1f687ed): The workflow could not be created. Causes: (b04785d7ff72ca31): Too many running jobs. Project apache-beam-testing is running 27 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
root: ERROR: HTTP status 429 trying to create job at dataflow service endpoint https://dataflow.googleapis.com
root: CRITICAL: details of server error: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'status': '429', 'content-length': '440', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Sat, 01 Apr 2017 05:08:29 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 429,
    "message": "(761af1bdd7a5308): The workflow could not be created. Causes: (501089b12b391d04): Too many running jobs. Project apache-beam-testing is running 25 jobs and project limit for active jobs is 25. To fix this, cancel an existing workflow via the UI, wait for a workflow to finish or contact dataflow-feedback@google.com to request an increase in quota.",
    "status": "RESOURCE_EXHAUSTED"
  }
}
>
--------------------- >> end captured logging << ---------------------
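
The failure captured here appears to be a quota problem rather than a test bug: the Dataflow service rejects job creation with HTTP 429 while apache-beam-testing is over its 25-concurrent-jobs limit, the client retries a few times (the response timestamps above are roughly 2s, 5s, 11s and 17s apart) and then surfaces the error. A generic retry-with-backoff sketch of that behaviour, with a hypothetical submit_job callable standing in for the real Dataflow API client:

import random
import time

def submit_with_backoff(submit_job, max_attempts=5, base_delay=2.0):
    # submit_job is a hypothetical callable that raises an exception carrying
    # a numeric `status` attribute, mirroring the HttpError in the log above.
    for attempt in range(max_attempts):
        try:
            return submit_job()
        except Exception as err:
            status = getattr(err, 'status', None) or getattr(err, 'status_code', None)
            if int(status or 0) != 429 or attempt == max_attempts - 1:
                raise  # not a quota error, or out of retries: surface it
            # Exponential backoff with jitter; the real client's retry spacing
            # above grows in a similar way before it gives up.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 1))
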

----------------------------------------------------------------------
Ran 13 tests in 1016.622s

FAILED (errors=1)
Build step 'Execute shell' marked build as failure