Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/03/27 19:05:57 UTC

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1196

See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1196/display/redirect?page=changes>

Changes:

[dawid] [BEAM-2831] Do not wrap IOException in SerializableCoder

------------------------------------------
[...truncated 780.64 KB...]
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T18:56:06.741465Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-27_11_56_05-5490453045814037913'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327185556-630164'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_11_56_05-5490453045814037913]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_56_05-5490453045814037913?project=apache-beam-testing
root: INFO: Job 2018-03-27_11_56_05-5490453045814037913 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T18:56:05.807Z: JOB_MESSAGE_WARNING: Job 2018-03-27_11_56_05-5490453045814037913 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T18:56:05.844Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-27_11_56_05-5490453045814037913. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T18:56:05.873Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-27_11_56_05-5490453045814037913.
root: INFO: 2018-03-27T18:56:08.331Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T18:56:08.630Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T18:56:09.538Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T18:56:09.578Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-27T18:56:09.607Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T18:56:09.633Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-27T18:56:09.647Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-27T18:56:09.687Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-27T18:56:09.715Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-27T18:56:09.745Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-27T18:56:09.773Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-27T18:56:09.806Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-27T18:56:09.840Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-27T18:56:09.866Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-27T18:56:09.898Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-27T18:56:09.924Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T18:56:09.957Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T18:56:09.990Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T18:56:10.016Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T18:56:10.037Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-27T18:56:10.071Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-27T18:56:10.104Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-27T18:56:10.127Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-27T18:56:10.154Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-27T18:56:10.185Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-27T18:56:10.222Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-27T18:56:10.243Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-27T18:56:10.279Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-27T18:56:10.314Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-27T18:56:10.454Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-27T18:56:10.518Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-27T18:56:10.549Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-27T18:56:10.561Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-27T18:56:10.594Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-27T18:56:10.695Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-27T18:56:10.754Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-27_11_56_05-5490453045814037913 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-27T18:56:20.304Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T18:56:36.922Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T18:56:57.132Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-27T19:01:48.657Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-27T19:01:48.746Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-27T19:01:48.876Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-27T19:01:48.957Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T19:01:54.559Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

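[Editor's sketch] The AttributeError above, repeated across the four work-item attempts that follow, originates in the SideInputMap lookup shown in the traceback: apache_beam/transforms/sideinputs.py calls self._view_class._from_runtime_iterable(...), so it expects the runner-supplied view class to define that method, and the Dataflow runner's _DataflowIterableSideInput evidently did not. A minimal, hypothetical sketch of that contract, using stand-in class names rather than Beam's actual runner code:

    class _IterableSideInputSketch(object):
        """Hypothetical stand-in for a runner side-input view class."""

        @staticmethod
        def _from_runtime_iterable(it, options):
            # An "iterable" view hands the runtime iterable to the DoFn unchanged.
            return it


    class SideInputMapSketch(object):
        """Mimics the per-window lookup seen in sideinputs.py in the traceback."""

        def __init__(self, view_class, view_options, iterable):
            self._view_class = view_class
            self._view_options = view_options
            self._iterable = iterable
            self._cache = {}

        def __getitem__(self, window):
            if window not in self._cache:
                # This is the call that raises AttributeError when the runner's
                # view class (here, _DataflowIterableSideInput) does not define
                # _from_runtime_iterable.
                self._cache[window] = self._view_class._from_runtime_iterable(
                    self._iterable, self._view_options)
            return self._cache[window]

    # Example: SideInputMapSketch(_IterableSideInputSketch, {}, [1, 2, 3])["w"]
    # returns [1, 2, 3]; substituting a view class without the method reproduces
    # the AttributeError seen in this build.
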
root: INFO: 2018-03-27T19:01:57.932Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T19:02:01.344Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T19:02:04.718Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T19:02:04.771Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-27T19:02:04.807Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032718555-03271156-0ad2-harness-7x7m,
  beamapp-jenkins-032718555-03271156-0ad2-harness-7x7m,
  beamapp-jenkins-032718555-03271156-0ad2-harness-7x7m,
  beamapp-jenkins-032718555-03271156-0ad2-harness-7x7m
root: INFO: 2018-03-27T19:02:04.935Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T19:02:04.994Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-27T19:02:05.043Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-27T19:03:25.077Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T19:03:25.119Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-03-27T19:03:25.164Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-27_11_56_05-5490453045814037913 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 2022.987s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_32_03-16150791362815293717?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_39_24-3883066675766862274?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_46_10-12596416067968885824?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_57_51-3317164318033171820?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_32_03-13075561341181641056?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_39_43-5093947020766332411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_47_40-1498724586845556160?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_54_41-4436569945421858515?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_32_03-16283634229691891067?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_39_44-12252465556747743794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_47_19-1033570348734285805?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_56_05-5490453045814037913?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_32_04-7598690882008751707?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_39_20-1312637914002205353?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_46_30-8970544101708334669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_53_36-11618455753502044909?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user aljoscha.krettek@gmail.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user dawid@getindata.com
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com

Jenkins build is back to normal : beam_PostCommit_Python_ValidatesRunner_Dataflow #1200

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1200/display/redirect?page=changes>


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1199

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1199/display/redirect?page=changes>

Changes:

[tgroh] Cleanups in GroupAlsoByWindowEvaluatorFactory

[tgroh] Allow Fusion to Continue with unknown PTransforms

[tgroh] fixup! Allow Fusion to Continue with unknown PTransforms

[tgroh] fixup! fixup! Allow Fusion to Continue with unknown PTransforms

[chamikara] [BEAM-3744] Expand Pubsub read API for Python. (#4901)

------------------------------------------
[...truncated 729.03 KB...]
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s13"
        }, 
        "serialized_fn": "<string of 980 bytes>", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s15", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s14"
        }, 
        "serialized_fn": "<string of 1172 bytes>", 
        "user_name": "assert_that/Match"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s16", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }
                      ], 
                      "is_pair_like": true
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T23:09:59.999477Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-27_16_09_58-12626702455421293246'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327230949-492200'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_16_09_58-12626702455421293246]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_09_58-12626702455421293246?project=apache-beam-testing
root: INFO: Job 2018-03-27_16_09_58-12626702455421293246 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T23:09:58.939Z: JOB_MESSAGE_WARNING: Job 2018-03-27_16_09_58-12626702455421293246 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T23:09:58.969Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-27_16_09_58-12626702455421293246. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T23:09:58.996Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-27_16_09_58-12626702455421293246.
root: INFO: 2018-03-27T23:10:01.860Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T23:10:01.969Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T23:10:03.136Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/1421 instances, 1/46 CPUs, 250/150 disk GB, 0/1998 SSD disk GB, 1/66 instance groups, 1/16 managed instance groups, 1/40 instance templates, 1/273 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1545.509s

FAILED (errors=6, failures=3)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_15_53_43-11482671923852177572?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_01_01-4322332307887274284?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_08_47-11830690090467400710?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_09_03-7990827244325842519?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_10_33-15883073858780408719?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_15_53_38-7423982132368121081?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_01_55-2442305553573735477?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_09_43-9919515862997320778?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_09_58-12626702455421293246?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_10_14-4988486362262838552?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_15_53_39-10837416903996163751?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_01_01-602443222349780264?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_11_07-17436675794668597233?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_15_53_38-6113242011989029334?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_00_55-13454450178281775119?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_07_32-1095996205026448340?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user aljoscha.krettek@gmail.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user dawid@getindata.com
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1198

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1198/display/redirect>

------------------------------------------
[...truncated 979.90 KB...]
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }
                      ], 
                      "is_pair_like": true
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "concatenate/MapToVoidKey1.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s3"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "concatenate/MapToVoidKey1"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T21:12:43.344355Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-27_14_12_42-8666691247344616596'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327211232-279365'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_14_12_42-8666691247344616596]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_12_42-8666691247344616596?project=apache-beam-testing
root: INFO: Job 2018-03-27_14_12_42-8666691247344616596 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T21:12:42.408Z: JOB_MESSAGE_WARNING: Job 2018-03-27_14_12_42-8666691247344616596 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T21:12:42.442Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-27_14_12_42-8666691247344616596. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T21:12:42.479Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-27_14_12_42-8666691247344616596.
root: INFO: 2018-03-27T21:12:45.703Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T21:12:45.823Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T21:12:46.601Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T21:12:46.645Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-27T21:12:46.685Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T21:12:46.777Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-27T21:12:46.811Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-27T21:12:46.901Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-27T21:12:46.925Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/MapToVoidKey1 into side pairs/Read
root: INFO: 2018-03-27T21:12:46.960Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/MapToVoidKey1 into side pairs/Read
root: INFO: 2018-03-27T21:12:47.037Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/MapToVoidKey0 into side list/Read
root: INFO: 2018-03-27T21:12:47.095Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/MapToVoidKey0 into side list/Read
root: INFO: 2018-03-27T21:12:47.190Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-27T21:12:47.225Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-27T21:12:47.304Z: JOB_MESSAGE_DETAILED: Unzipping flatten s14 for input s12.out
root: INFO: 2018-03-27T21:12:47.343Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2018-03-27T21:12:47.411Z: JOB_MESSAGE_DETAILED: Unzipping flatten s14-u13 for input s15-reify-value0-c11
root: INFO: 2018-03-27T21:12:47.445Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s14-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T21:12:47.484Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-27T21:12:47.568Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-27T21:12:47.608Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-27T21:12:47.693Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2018-03-27T21:12:47.766Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T21:12:47.794Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/concatenate into main input/Read
root: INFO: 2018-03-27T21:12:47.833Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into concatenate/concatenate
root: INFO: 2018-03-27T21:12:47.904Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-27T21:12:47.935Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-27T21:12:48.016Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-27T21:12:48.047Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-27T21:12:48.076Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-27T21:12:48.150Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-27T21:12:48.363Z: JOB_MESSAGE_DEBUG: Executing wait step start23
root: INFO: 2018-03-27T21:12:48.436Z: JOB_MESSAGE_BASIC: Executing operation side pairs/Read+concatenate/MapToVoidKey1+concatenate/MapToVoidKey1
root: INFO: 2018-03-27T21:12:48.523Z: JOB_MESSAGE_BASIC: Executing operation side list/Read+concatenate/MapToVoidKey0+concatenate/MapToVoidKey0
root: INFO: 2018-03-27T21:12:48.536Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-27T21:12:48.559Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-27T21:12:48.562Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-27T21:12:48.745Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: Job 2018-03-27_14_12_42-8666691247344616596 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-27T21:12:48.806Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T21:12:58.494Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T21:13:14.356Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T21:15:21.834Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-27T21:18:38.828Z: JOB_MESSAGE_DEBUG: Value "concatenate/MapToVoidKey1.out" materialized.
root: INFO: 2018-03-27T21:18:38.911Z: JOB_MESSAGE_BASIC: Executing operation concatenate/_DataflowIterableSideInput(MapToVoidKey1.out.0)
root: INFO: 2018-03-27T21:18:39.035Z: JOB_MESSAGE_DEBUG: Value "concatenate/_DataflowIterableSideInput(MapToVoidKey1.out.0).output" materialized.
root: INFO: 2018-03-27T21:18:57.854Z: JOB_MESSAGE_DEBUG: Value "concatenate/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-27T21:18:57.930Z: JOB_MESSAGE_BASIC: Executing operation concatenate/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-27T21:18:58.046Z: JOB_MESSAGE_DEBUG: Value "concatenate/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-27T21:18:58.123Z: JOB_MESSAGE_BASIC: Executing operation main input/Read+concatenate/concatenate+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T21:19:02.856Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T21:19:06.362Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T21:19:08.751Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T21:19:09.173Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T21:19:09.223Z: JOB_MESSAGE_DEBUG: Executing failure step failure22
root: INFO: 2018-03-27T21:19:09.257Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S07:main input/Read+concatenate/concatenate+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032721123-03271412-8153-harness-58mk,
  beamapp-jenkins-032721123-03271412-8153-harness-58mk,
  beamapp-jenkins-032721123-03271412-8153-harness-58mk,
  beamapp-jenkins-032721123-03271412-8153-harness-58mk
root: INFO: 2018-03-27T21:19:09.382Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T21:19:09.435Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-27T21:19:09.471Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-27T21:20:47.349Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T21:20:47.445Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-27_14_12_42-8666691247344616596 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 929.102s

FAILED (errors=12)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-6766723166623993968?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_13_31-10929177711809778955?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_15_16-16354086526836046220?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-1201976823224416842?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_07_27-7271190073261436645?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_09_20-12600433319759080049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_10_50-798850488600387760?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_12_42-8666691247344616596?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-1765600968155479915?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_13_26-10060460114915233142?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_14_51-11698478401267540067?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_16_36-2122733496028554464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-16039646830469645721?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_13_00-6879600078595900695?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_14_46-15759488323730311834?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_16_22-5100222963713141427?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user aljoscha.krettek@gmail.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user dawid@getindata.com
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1197

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1197/display/redirect?page=changes>

Changes:

[wcn] Fix documentation around pipeline creation.

------------------------------------------
[...truncated 1.38 MB...]
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s30", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_merge_tagged_vals_under_key"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert:even/Group/Map(_merge_tagged_vals_under_key).out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s29"
        }, 
        "serialized_fn": "<string of 1380 bytes>", 
        "user_name": "assert:even/Group/Map(_merge_tagged_vals_under_key)"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s31", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert:even/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s30"
        }, 
        "serialized_fn": "<string of 980 bytes>", 
        "user_name": "assert:even/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s32", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert:even/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s31"
        }, 
        "serialized_fn": "<string of 1148 bytes>", 
        "user_name": "assert:even/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
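
For context, steps s30 through s32 above ("assert:even/Group/Map(_merge_tagged_vals_under_key)", "assert:even/Unkey", "assert:even/Match") are the internal transforms that apache_beam.testing.util.assert_that expands into when given the label "assert:even". A minimal sketch of the kind of ValidatesRunner assertion that produces such a graph (a hypothetical example, not the actual failing test; the element values are invented):

# Hypothetical sketch: an assertion that expands into
# "assert:even/Group/Map(_merge_tagged_vals_under_key)", "assert:even/Unkey"
# and "assert:even/Match" steps like the ones in the job graph above.
# Not the actual failing test; the element values are invented.
import apache_beam as beam
from apache_beam.testing.util import assert_that, equal_to

with beam.Pipeline() as p:  # on Jenkins this would run with the DataflowRunner
    evens = p | 'create_evens' >> beam.Create([2, 4, 6, 8])
    assert_that(evens, equal_to([2, 4, 6, 8]), label='assert:even')

equal_to supplies the "_equal" matcher shown in step s32's display_data; the Map(_merge_tagged_vals_under_key) and Unkey steps (s30, s31) are part of how assert_that gathers all elements under a single key before applying the matcher.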
root: INFO: Create job: <Job
 createTime: u'2018-03-27T20:43:53.229117Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-27_13_43_51-10392835747257325335'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327204338-336520'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_13_43_51-10392835747257325335]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_43_51-10392835747257325335?project=apache-beam-testing
root: INFO: Job 2018-03-27_13_43_51-10392835747257325335 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T20:43:51.980Z: JOB_MESSAGE_WARNING: Job 2018-03-27_13_43_51-10392835747257325335 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T20:43:51.997Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-27_13_43_51-10392835747257325335. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T20:43:52.008Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-27_13_43_51-10392835747257325335.
root: INFO: 2018-03-27T20:43:55.405Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T20:43:55.571Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T20:43:57.256Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/1422 instances, 1/47 CPUs, 250/150 disk GB, 0/1998 SSD disk GB, 1/72 instance groups, 1/22 managed instance groups, 1/48 instance templates, 1/274 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
root: INFO: 2018-03-27T20:43:57.465Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T20:43:57.587Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
--------------------- >> end captured logging << ---------------------
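
The JOB_MESSAGE_ERROR in the captured log above failed on Compute Engine quota: the job requested 250 disk GB in us-central1 while only 150 GB of quota was available. As an illustration only (the option values below are hypothetical, not the settings used by this Jenkins job), the resources a small test job requests can be reduced through standard Beam worker pipeline options such as --disk_size_gb and --max_num_workers:

# Hypothetical illustration: capping worker count and per-worker disk so a
# small test job requests less than the 250 GB of disk reported in the quota
# error above. These values are examples, not the Jenkins job's settings.
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=apache-beam-testing',
    '--region=us-central1',
    '--temp_location=gs://<your-bucket>/tmp',  # placeholder bucket
    '--max_num_workers=1',
    '--disk_size_gb=30',
])
# A pipeline would then be built with: beam.Pipeline(options=options)

The alternative, as the error message itself notes, is to request more quota for the project (see https://cloud.google.com/compute/docs/resource-quotas).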

----------------------------------------------------------------------
Ran 16 tests in 1257.223s

FAILED (errors=13, failures=2)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_43_52-11000115026759003252?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_45_50-15530159881638042658?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_55_51-10602826279174943190?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_57_31-8219724657153712630?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_43_51-10392835747257325335?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_44_09-5754332455186804390?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_46_06-17145157875702262217?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_55_07-759700236097906936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_02_18-6393606172838795216?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_43_51-660648829442181237?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_45_56-7539489058280770329?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_54_09-6584076154273951018?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_43_51-5279905424225740611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_44_08-11272382841059791538?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_46_00-16648436159664885353?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_55_50-10225882695426065704?project=apache-beam-testing.
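
Each "Found:" link above embeds the project (apache-beam-testing), the region (us-central1) and a Dataflow job id. As a hedged sketch (it assumes google-api-python-client is installed and application default credentials are configured; it is not part of the test suite), the final state of any of these jobs could be looked up through the Dataflow v1b3 API instead of the console:

# Hypothetical helper, not part of the Beam test suite: fetch the current
# state of one of the Dataflow jobs linked above via the Dataflow v1b3 API.
# Assumes google-api-python-client and application default credentials.
from googleapiclient.discovery import build

def job_state(project, region, job_id):
    dataflow = build('dataflow', 'v1b3')
    job = dataflow.projects().locations().jobs().get(
        projectId=project, location=region, jobId=job_id).execute()
    return job.get('currentState')

print(job_state('apache-beam-testing', 'us-central1',
                '2018-03-27_13_43_51-10392835747257325335'))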
Build step 'Execute shell' marked build as failure