Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/03/24 15:33:16 UTC

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1177

See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1177/display/redirect>

------------------------------------------
[...truncated 766.77 KB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "kind:bytes"
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s4"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-24T15:24:41.498026Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-24_08_24_40-3584323830304294032'
 location: u'us-central1'
 name: u'beamapp-jenkins-0324152430-404974'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-24_08_24_40-3584323830304294032]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_24_40-3584323830304294032?project=apache-beam-testing
root: INFO: Job 2018-03-24_08_24_40-3584323830304294032 is in state JOB_STATE_PENDING
root: INFO: 2018-03-24T15:24:40.472Z: JOB_MESSAGE_WARNING: Job 2018-03-24_08_24_40-3584323830304294032 might autoscale up to 1000 workers.
root: INFO: 2018-03-24T15:24:40.497Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-24_08_24_40-3584323830304294032. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-24T15:24:40.530Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-24_08_24_40-3584323830304294032.
root: INFO: 2018-03-24T15:24:43.220Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-24T15:24:43.330Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-24T15:24:44.343Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-24T15:24:44.376Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-24T15:24:44.407Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-24T15:24:44.440Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-24T15:24:44.458Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-24T15:24:44.498Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-24T15:24:44.531Z: JOB_MESSAGE_DETAILED: Unzipping flatten s13 for input s11.out
root: INFO: 2018-03-24T15:24:44.564Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2018-03-24T15:24:44.598Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-24T15:24:44.631Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-24T15:24:44.659Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-24T15:24:44.684Z: JOB_MESSAGE_DETAILED: Unzipping flatten s13-u13 for input s14-reify-value0-c11
root: INFO: 2018-03-24T15:24:44.719Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s13-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-24T15:24:44.750Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-24T15:24:44.776Z: JOB_MESSAGE_DETAILED: Created new flatten s4-c17 to unzip producers of s18
root: INFO: 2018-03-24T15:24:44.800Z: JOB_MESSAGE_DETAILED: Unzipping flatten s4-c17 for input s2.out
root: INFO: 2018-03-24T15:24:44.815Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0, through flatten Flatten, into producer side1/Read
root: INFO: 2018-03-24T15:24:44.843Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-24T15:24:44.859Z: JOB_MESSAGE_DETAILED: Unzipping flatten s4 for input s2.out
root: INFO: 2018-03-24T15:24:44.892Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0, through flatten Flatten, into producer side1/Read
root: INFO: 2018-03-24T15:24:44.920Z: JOB_MESSAGE_DETAILED: Fusing consumer FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0 into side2/Read
root: INFO: 2018-03-24T15:24:44.952Z: JOB_MESSAGE_DETAILED: Fusing consumer FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0 into side2/Read
root: INFO: 2018-03-24T15:24:44.986Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-24T15:24:45.020Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2018-03-24T15:24:45.043Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-24T15:24:45.076Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into FlatMap(<lambda at sideinputs_test.py:302>)/FlatMap(<lambda at sideinputs_test.py:302>)
root: INFO: 2018-03-24T15:24:45.104Z: JOB_MESSAGE_DETAILED: Fusing consumer FlatMap(<lambda at sideinputs_test.py:302>)/FlatMap(<lambda at sideinputs_test.py:302>) into main input/Read
root: INFO: 2018-03-24T15:24:45.137Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-24T15:24:45.164Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-24T15:24:45.198Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-24T15:24:45.232Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-24T15:24:45.264Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-24T15:24:45.397Z: JOB_MESSAGE_DEBUG: Executing wait step start35
root: INFO: 2018-03-24T15:24:45.456Z: JOB_MESSAGE_BASIC: Executing operation side2/Read+FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0+FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0
root: INFO: 2018-03-24T15:24:45.481Z: JOB_MESSAGE_BASIC: Executing operation side1/Read+FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0+FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0
root: INFO: 2018-03-24T15:24:45.512Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-24T15:24:45.673Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-24T15:24:45.691Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: Job 2018-03-24_08_24_40-3584323830304294032 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-24T15:24:45.890Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-24T15:24:45.953Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-24T15:24:54.079Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-24T15:25:42.091Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-24T15:25:59.295Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-24T15:30:44.971Z: JOB_MESSAGE_DEBUG: Value "FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-24T15:30:48.613Z: JOB_MESSAGE_DEBUG: Value "FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-24T15:30:48.674Z: JOB_MESSAGE_BASIC: Executing operation s4-u26
root: INFO: 2018-03-24T15:30:48.820Z: JOB_MESSAGE_DEBUG: Value "FlatMap(<lambda at sideinputs_test.py:302>)/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-24T15:30:48.879Z: JOB_MESSAGE_BASIC: Executing operation FlatMap(<lambda at sideinputs_test.py:302>)/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-24T15:30:48.986Z: JOB_MESSAGE_DEBUG: Value "FlatMap(<lambda at sideinputs_test.py:302>)/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-24T15:30:49.050Z: JOB_MESSAGE_BASIC: Executing operation main input/Read+FlatMap(<lambda at sideinputs_test.py:302>)/FlatMap(<lambda at sideinputs_test.py:302>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-24T15:30:53.585Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
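
The AttributeError above points at the contract in apache_beam/transforms/sideinputs.py:
the per-window lookup delegates to self._view_class._from_runtime_iterable(...), so a view
class such as _DataflowIterableSideInput that does not define that classmethod fails as soon
as the DoFn starts. Below is a minimal, self-contained sketch of that lookup; apart from the
_DataflowIterableSideInput name, the classes and signatures are illustrative only and not the
actual worker code.

class _SideInputMap(object):
    """Caches one materialized side-input value per window (mirrors __getitem__)."""

    def __init__(self, view_class, iterable):
        self._view_class = view_class
        self._iterable = iterable
        self._cache = {}

    def __getitem__(self, window):
        if window not in self._cache:
            # Fails with AttributeError if view_class lacks _from_runtime_iterable.
            self._cache[window] = self._view_class._from_runtime_iterable(
                self._iterable, {})
        return self._cache[window]


class _IterableView(object):
    @classmethod
    def _from_runtime_iterable(cls, it, options):
        return list(it)  # a view that knows how to build itself from an iterable


class _DataflowIterableSideInput(object):
    pass  # no _from_runtime_iterable: reproduces the error above


print(_SideInputMap(_IterableView, [1, 2, 3])["global"])               # [1, 2, 3]
print(_SideInputMap(_DataflowIterableSideInput, [1, 2, 3])["global"])  # AttributeError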

root: INFO: 2018-03-24T15:30:56.975Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-24T15:31:00.377Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-24T15:31:03.751Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-24T15:31:03.789Z: JOB_MESSAGE_DEBUG: Executing failure step failure34
root: INFO: 2018-03-24T15:31:03.824Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S07:main input/Read+FlatMap(<lambda at sideinputs_test.py:302>)/FlatMap(<lambda at sideinputs_test.py:302>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032415243-03240824-4ec5-harness-39hh,
  beamapp-jenkins-032415243-03240824-4ec5-harness-39hh,
  beamapp-jenkins-032415243-03240824-4ec5-harness-39hh,
  beamapp-jenkins-032415243-03240824-4ec5-harness-39hh
root: INFO: 2018-03-24T15:31:03.933Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-24T15:31:03.975Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-24T15:31:04.002Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-24T15:32:29.826Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-24T15:32:29.875Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-24_08_24_40-3584323830304294032 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1947.955s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_00_43-9736913346032438818?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_09_18-222824190131179099?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_17_08-5997926335616161256?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_24_40-3584323830304294032?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_00_46-4370792559336822381?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_08_09-3522514824911455189?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_15_15-14954749041921146254?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_25_10-1248260207067418933?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_00_45-11900728556721108451?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_09_07-9760510945719394510?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_17_33-15344652913525069588?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_24_59-16756136481250449402?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_00_44-16557345865703643069?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_08_20-9830032670753727005?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_16_01-5721463290046410012?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_08_24_12-10221863772801993148?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com

Jenkins build is back to normal : beam_PostCommit_Python_ValidatesRunner_Dataflow #1200

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1200/display/redirect?page=changes>


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1199

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1199/display/redirect?page=changes>

Changes:

[tgroh] Cleanups in GroupAlsoByWindowEvaluatorFactory

[tgroh] Allow Fusion to Continue with unknown PTransforms

[tgroh] fixup! Allow Fusion to Continue with unknown PTransforms

[tgroh] fixup! fixup! Allow Fusion to Continue with unknown PTransforms

[chamikara] [BEAM-3744] Expand Pubsub read API for Python. (#4901)

------------------------------------------
[...truncated 729.03 KB...]
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s13"
        }, 
        "serialized_fn": "<string of 980 bytes>", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s15", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s14"
        }, 
        "serialized_fn": "<string of 1172 bytes>", 
        "user_name": "assert_that/Match"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s16", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }
                      ], 
                      "is_pair_like": true
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T23:09:59.999477Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-27_16_09_58-12626702455421293246'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327230949-492200'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_16_09_58-12626702455421293246]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_09_58-12626702455421293246?project=apache-beam-testing
root: INFO: Job 2018-03-27_16_09_58-12626702455421293246 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T23:09:58.939Z: JOB_MESSAGE_WARNING: Job 2018-03-27_16_09_58-12626702455421293246 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T23:09:58.969Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-27_16_09_58-12626702455421293246. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T23:09:58.996Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-27_16_09_58-12626702455421293246.
root: INFO: 2018-03-27T23:10:01.860Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T23:10:01.969Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T23:10:03.136Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/1421 instances, 1/46 CPUs, 250/150 disk GB, 0/1998 SSD disk GB, 1/66 instance groups, 1/16 managed instance groups, 1/40 instance templates, 1/273 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
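
The quota summary above lists required vs. available resources for the region; the failing
dimension here is disk (250 GB required vs. 150 GB available). A minimal sketch, assuming the
google-api-python-client package and application-default credentials, of reading the same
per-region Compute Engine quotas programmatically (project and region names taken from the
log above):

from googleapiclient import discovery

def print_region_quotas(project="apache-beam-testing", region="us-central1"):
    compute = discovery.build("compute", "v1")
    info = compute.regions().get(project=project, region=region).execute()
    for quota in info.get("quotas", []):
        # Each entry carries the metric name, current usage, and the hard limit,
        # e.g. DISKS_TOTAL_GB for the disk GB line quoted in the error message.
        print(quota["metric"], quota["usage"], quota["limit"])

if __name__ == "__main__":
    print_region_quotas()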
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1545.509s

FAILED (errors=6, failures=3)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_15_53_43-11482671923852177572?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_01_01-4322332307887274284?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_08_47-11830690090467400710?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_09_03-7990827244325842519?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_10_33-15883073858780408719?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_15_53_38-7423982132368121081?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_01_55-2442305553573735477?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_09_43-9919515862997320778?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_09_58-12626702455421293246?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_10_14-4988486362262838552?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_15_53_39-10837416903996163751?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_01_01-602443222349780264?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_11_07-17436675794668597233?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_15_53_38-6113242011989029334?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_00_55-13454450178281775119?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_16_07_32-1095996205026448340?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user aljoscha.krettek@gmail.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user dawid@getindata.com
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1198

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1198/display/redirect>

------------------------------------------
[...truncated 979.90 KB...]
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }
                      ], 
                      "is_pair_like": true
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "concatenate/MapToVoidKey1.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s3"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "concatenate/MapToVoidKey1"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T21:12:43.344355Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-27_14_12_42-8666691247344616596'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327211232-279365'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_14_12_42-8666691247344616596]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_12_42-8666691247344616596?project=apache-beam-testing
root: INFO: Job 2018-03-27_14_12_42-8666691247344616596 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T21:12:42.408Z: JOB_MESSAGE_WARNING: Job 2018-03-27_14_12_42-8666691247344616596 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T21:12:42.442Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-27_14_12_42-8666691247344616596. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T21:12:42.479Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-27_14_12_42-8666691247344616596.
root: INFO: 2018-03-27T21:12:45.703Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T21:12:45.823Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T21:12:46.601Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T21:12:46.645Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-27T21:12:46.685Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T21:12:46.777Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-27T21:12:46.811Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-27T21:12:46.901Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-27T21:12:46.925Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/MapToVoidKey1 into side pairs/Read
root: INFO: 2018-03-27T21:12:46.960Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/MapToVoidKey1 into side pairs/Read
root: INFO: 2018-03-27T21:12:47.037Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/MapToVoidKey0 into side list/Read
root: INFO: 2018-03-27T21:12:47.095Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/MapToVoidKey0 into side list/Read
root: INFO: 2018-03-27T21:12:47.190Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-27T21:12:47.225Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-27T21:12:47.304Z: JOB_MESSAGE_DETAILED: Unzipping flatten s14 for input s12.out
root: INFO: 2018-03-27T21:12:47.343Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2018-03-27T21:12:47.411Z: JOB_MESSAGE_DETAILED: Unzipping flatten s14-u13 for input s15-reify-value0-c11
root: INFO: 2018-03-27T21:12:47.445Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s14-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T21:12:47.484Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-27T21:12:47.568Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-27T21:12:47.608Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-27T21:12:47.693Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2018-03-27T21:12:47.766Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T21:12:47.794Z: JOB_MESSAGE_DETAILED: Fusing consumer concatenate/concatenate into main input/Read
root: INFO: 2018-03-27T21:12:47.833Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into concatenate/concatenate
root: INFO: 2018-03-27T21:12:47.904Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-27T21:12:47.935Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-27T21:12:48.016Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-27T21:12:48.047Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-27T21:12:48.076Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-27T21:12:48.150Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-27T21:12:48.363Z: JOB_MESSAGE_DEBUG: Executing wait step start23
root: INFO: 2018-03-27T21:12:48.436Z: JOB_MESSAGE_BASIC: Executing operation side pairs/Read+concatenate/MapToVoidKey1+concatenate/MapToVoidKey1
root: INFO: 2018-03-27T21:12:48.523Z: JOB_MESSAGE_BASIC: Executing operation side list/Read+concatenate/MapToVoidKey0+concatenate/MapToVoidKey0
root: INFO: 2018-03-27T21:12:48.536Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-27T21:12:48.559Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-27T21:12:48.562Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-27T21:12:48.745Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: Job 2018-03-27_14_12_42-8666691247344616596 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-27T21:12:48.806Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T21:12:58.494Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T21:13:14.356Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T21:15:21.834Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-27T21:18:38.828Z: JOB_MESSAGE_DEBUG: Value "concatenate/MapToVoidKey1.out" materialized.
root: INFO: 2018-03-27T21:18:38.911Z: JOB_MESSAGE_BASIC: Executing operation concatenate/_DataflowIterableSideInput(MapToVoidKey1.out.0)
root: INFO: 2018-03-27T21:18:39.035Z: JOB_MESSAGE_DEBUG: Value "concatenate/_DataflowIterableSideInput(MapToVoidKey1.out.0).output" materialized.
root: INFO: 2018-03-27T21:18:57.854Z: JOB_MESSAGE_DEBUG: Value "concatenate/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-27T21:18:57.930Z: JOB_MESSAGE_BASIC: Executing operation concatenate/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-27T21:18:58.046Z: JOB_MESSAGE_DEBUG: Value "concatenate/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-27T21:18:58.123Z: JOB_MESSAGE_BASIC: Executing operation main input/Read+concatenate/concatenate+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T21:19:02.856Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T21:19:06.362Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T21:19:08.751Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T21:19:09.173Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T21:19:09.223Z: JOB_MESSAGE_DEBUG: Executing failure step failure22
root: INFO: 2018-03-27T21:19:09.257Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S07:main input/Read+concatenate/concatenate+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032721123-03271412-8153-harness-58mk,
  beamapp-jenkins-032721123-03271412-8153-harness-58mk,
  beamapp-jenkins-032721123-03271412-8153-harness-58mk,
  beamapp-jenkins-032721123-03271412-8153-harness-58mk
root: INFO: 2018-03-27T21:19:09.382Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T21:19:09.435Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-27T21:19:09.471Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-27T21:20:47.349Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T21:20:47.445Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-27_14_12_42-8666691247344616596 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 929.102s

FAILED (errors=12)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-6766723166623993968?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_13_31-10929177711809778955?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_15_16-16354086526836046220?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-1201976823224416842?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_07_27-7271190073261436645?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_09_20-12600433319759080049?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_10_50-798850488600387760?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_12_42-8666691247344616596?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-1765600968155479915?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_13_26-10060460114915233142?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_14_51-11698478401267540067?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_16_36-2122733496028554464?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-16039646830469645721?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_13_00-6879600078595900695?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_14_46-15759488323730311834?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_16_22-5100222963713141427?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user aljoscha.krettek@gmail.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user dawid@getindata.com
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1197

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1197/display/redirect?page=changes>

Changes:

[wcn] Fix documentation around pipeline creation.

------------------------------------------
[...truncated 1.38 MB...]
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s30", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_merge_tagged_vals_under_key"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert:even/Group/Map(_merge_tagged_vals_under_key).out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s29"
        }, 
        "serialized_fn": "<string of 1380 bytes>", 
        "user_name": "assert:even/Group/Map(_merge_tagged_vals_under_key)"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s31", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert:even/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s30"
        }, 
        "serialized_fn": "<string of 980 bytes>", 
        "user_name": "assert:even/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s32", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert:even/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s31"
        }, 
        "serialized_fn": "<string of 1148 bytes>", 
        "user_name": "assert:even/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
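The job graph above ends each step with an "encoding" block: a windowed key/value pair assembled from Beam coders ("kind:windowed_value" wrapping "kind:pair" / "kind:bytes", plus the pickled FastPrimitivesCoder payloads). As a hedged illustration only (not taken from the log, and not the exact coder stack Dataflow builds), the Python SDK composes such coders roughly like this:

# Illustrative sketch: compose a KV coder similar to the "encoding" blocks above
# and round-trip one element. FastPrimitivesCoder/TupleCoder are real SDK coders;
# the sample values are made up.
from apache_beam.coders import FastPrimitivesCoder, TupleCoder

kv_coder = TupleCoder([FastPrimitivesCoder(), FastPrimitivesCoder()])
payload = kv_coder.encode((b'key', 42))            # serialize a (bytes, int) pair
assert kv_coder.decode(payload) == (b'key', 42)    # lossless round trip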
root: INFO: Create job: <Job
 createTime: u'2018-03-27T20:43:53.229117Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-27_13_43_51-10392835747257325335'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327204338-336520'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_13_43_51-10392835747257325335]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_43_51-10392835747257325335?project=apache-beam-testing
root: INFO: Job 2018-03-27_13_43_51-10392835747257325335 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T20:43:51.980Z: JOB_MESSAGE_WARNING: Job 2018-03-27_13_43_51-10392835747257325335 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T20:43:51.997Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-27_13_43_51-10392835747257325335. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T20:43:52.008Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-27_13_43_51-10392835747257325335.
root: INFO: 2018-03-27T20:43:55.405Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T20:43:55.571Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T20:43:57.256Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/1422 instances, 1/47 CPUs, 250/150 disk GB, 0/1998 SSD disk GB, 1/72 instance groups, 1/22 managed instance groups, 1/48 instance templates, 1/274 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
root: INFO: 2018-03-27T20:43:57.465Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T20:43:57.587Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
--------------------- >> end captured logging << ---------------------
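This particular run died before any user code executed: the JOB_MESSAGE_ERROR above is a Compute Engine quota failure in us-central1 (250 disk GB requested against 150 available), so the workflow was torn down immediately. As a hedged sketch only (the real ValidatesRunner job sets its own flags), one way to keep a single-worker test run inside such a quota is to cap workers and shrink the per-worker disk via standard Dataflow pipeline options:

# Illustrative only -- flag values are assumptions, not the job's actual settings.
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=apache-beam-testing',  # project named in the log
    '--region=us-central1',
    '--max_num_workers=1',            # the quota summary shows only 1 instance was requested
    '--disk_size_gb=50',              # stay well under the 150 GB disk quota
])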

----------------------------------------------------------------------
Ran 16 tests in 1257.223s

FAILED (errors=13, failures=2)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_43_52-11000115026759003252?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_45_50-15530159881638042658?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_55_51-10602826279174943190?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_57_31-8219724657153712630?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_43_51-10392835747257325335?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_44_09-5754332455186804390?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_46_06-17145157875702262217?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_55_07-759700236097906936?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_02_18-6393606172838795216?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_43_51-660648829442181237?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_45_56-7539489058280770329?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_54_09-6584076154273951018?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_43_51-5279905424225740611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_44_08-11272382841059791538?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_46_00-16648436159664885353?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_13_55_50-10225882695426065704?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1196

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1196/display/redirect?page=changes>

Changes:

[dawid] [BEAM-2831] Do not wrap IOException in SerializableCoder

------------------------------------------
[...truncated 780.64 KB...]
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T18:56:06.741465Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-27_11_56_05-5490453045814037913'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327185556-630164'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_11_56_05-5490453045814037913]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_56_05-5490453045814037913?project=apache-beam-testing
root: INFO: Job 2018-03-27_11_56_05-5490453045814037913 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T18:56:05.807Z: JOB_MESSAGE_WARNING: Job 2018-03-27_11_56_05-5490453045814037913 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T18:56:05.844Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-27_11_56_05-5490453045814037913. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T18:56:05.873Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-27_11_56_05-5490453045814037913.
root: INFO: 2018-03-27T18:56:08.331Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T18:56:08.630Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T18:56:09.538Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T18:56:09.578Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-27T18:56:09.607Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T18:56:09.633Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-27T18:56:09.647Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-27T18:56:09.687Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-27T18:56:09.715Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-27T18:56:09.745Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-27T18:56:09.773Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-27T18:56:09.806Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-27T18:56:09.840Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-27T18:56:09.866Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-27T18:56:09.898Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-27T18:56:09.924Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T18:56:09.957Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T18:56:09.990Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T18:56:10.016Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T18:56:10.037Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-27T18:56:10.071Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-27T18:56:10.104Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-27T18:56:10.127Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-27T18:56:10.154Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-27T18:56:10.185Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-27T18:56:10.222Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-27T18:56:10.243Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-27T18:56:10.279Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-27T18:56:10.314Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-27T18:56:10.454Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-27T18:56:10.518Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-27T18:56:10.549Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-27T18:56:10.561Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-27T18:56:10.594Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-27T18:56:10.695Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-27T18:56:10.754Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-27_11_56_05-5490453045814037913 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-27T18:56:20.304Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T18:56:36.922Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T18:56:57.132Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-27T19:01:48.657Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-27T19:01:48.746Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-27T19:01:48.876Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-27T19:01:48.957Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T19:01:54.559Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T19:01:57.932Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T19:02:01.344Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T19:02:04.718Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T19:02:04.771Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-27T19:02:04.807Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032718555-03271156-0ad2-harness-7x7m,
  beamapp-jenkins-032718555-03271156-0ad2-harness-7x7m,
  beamapp-jenkins-032718555-03271156-0ad2-harness-7x7m,
  beamapp-jenkins-032718555-03271156-0ad2-harness-7x7m
root: INFO: 2018-03-27T19:02:04.935Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T19:02:04.994Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-27T19:02:05.043Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-27T19:03:25.077Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T19:03:25.119Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-03-27T19:03:25.164Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-27_11_56_05-5490453045814037913 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
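All four worker attempts in #1196 failed with the same AttributeError before processing any elements: per the traceback, the worker-side SideInputMap.__getitem__ (apache_beam/transforms/sideinputs.py, line 62) asks the side-input view class for a _from_runtime_iterable hook that turns the materialized iterable for a window into the value handed to the DoFn, and the Dataflow runner's _DataflowIterableSideInput type does not define one. A hedged sketch of the protocol the traceback implies is below; the class name and second argument are illustrative, not the SDK's actual definitions.

# Hedged sketch of the hook sideinputs.py is calling; the real view classes live
# in apache_beam.pvalue and in the Dataflow runner and differ in detail.
class ExampleIterableView(object):
    @staticmethod
    def _from_runtime_iterable(it, options):
        # An "iterable" view hands the materialized values straight through;
        # list/singleton/dict views would reshape `it` here instead.
        return it

# Effectively, per the traceback, sideinputs.py does (once per window):
#     self._cache[window] = self._view_class._from_runtime_iterable(raw_iterable, view_options)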

----------------------------------------------------------------------
Ran 16 tests in 2022.987s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_32_03-16150791362815293717?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_39_24-3883066675766862274?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_46_10-12596416067968885824?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_57_51-3317164318033171820?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_32_03-13075561341181641056?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_39_43-5093947020766332411?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_47_40-1498724586845556160?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_54_41-4436569945421858515?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_32_03-16283634229691891067?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_39_44-12252465556747743794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_47_19-1033570348734285805?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_56_05-5490453045814037913?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_32_04-7598690882008751707?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_39_20-1312637914002205353?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_46_30-8970544101708334669?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_11_53_36-11618455753502044909?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1195

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1195/display/redirect>

------------------------------------------
[...truncated 777.64 KB...]
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T15:23:56.197610Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-27_08_23_55-3650033612610576768'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327152346-413512'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_08_23_55-3650033612610576768]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_23_55-3650033612610576768?project=apache-beam-testing
root: INFO: Job 2018-03-27_08_23_55-3650033612610576768 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T15:23:55.089Z: JOB_MESSAGE_WARNING: Job 2018-03-27_08_23_55-3650033612610576768 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T15:23:55.113Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-27_08_23_55-3650033612610576768. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T15:23:55.141Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-27_08_23_55-3650033612610576768.
root: INFO: 2018-03-27T15:23:58.350Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T15:23:58.662Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T15:24:00.060Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T15:24:00.125Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-27T15:24:00.169Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T15:24:00.215Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-27T15:24:00.258Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-27T15:24:00.347Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-27T15:24:00.398Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-27T15:24:00.440Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-27T15:24:00.475Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-27T15:24:00.517Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-27T15:24:00.559Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-27T15:24:00.599Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-27T15:24:00.634Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-27T15:24:00.672Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T15:24:00.710Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T15:24:00.750Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T15:24:00.787Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T15:24:00.818Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-27T15:24:00.864Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-27T15:24:00.904Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-27T15:24:00.962Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-27T15:24:01.005Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-27T15:24:01.050Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-27T15:24:01.102Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-27T15:24:01.155Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-27T15:24:01.184Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-27T15:24:01.228Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: Job 2018-03-27_08_23_55-3650033612610576768 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-27T15:24:01.482Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-27T15:24:01.596Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-27T15:24:01.638Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-27T15:24:01.652Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-27T15:24:01.694Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-27T15:24:01.899Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-27T15:24:01.985Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T15:24:11.585Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T15:24:27.786Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T15:26:26.611Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-27T15:29:27.908Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-27T15:29:27.975Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-27T15:29:28.085Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-27T15:29:28.169Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T15:29:33.782Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T15:29:37.215Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T15:29:40.597Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T15:29:44.006Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T15:29:44.116Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-27T15:29:44.151Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032715234-03270823-a54d-harness-4sh5,
  beamapp-jenkins-032715234-03270823-a54d-harness-4sh5,
  beamapp-jenkins-032715234-03270823-a54d-harness-4sh5,
  beamapp-jenkins-032715234-03270823-a54d-harness-4sh5
root: INFO: 2018-03-27T15:29:44.351Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T15:29:44.486Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-27T15:29:44.523Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-27T15:31:12.742Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T15:31:12.923Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-27_08_23_55-3650033612610576768 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
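Builds #1195 and #1196 fail the same way, and the fused step names in the log (side/Read, compute/MapToVoidKey0, compute/compute, start/Read, assert_that/...) outline the failing test shape: a main collection transformed with a side input and verified with assert_that. A minimal, self-contained sketch of that shape follows; the element values and labels are illustrative (the real tests are in sideinputs_test.py), and it runs on the default DirectRunner rather than Dataflow.

# Minimal sketch of the pipeline shape named in the log; values are made up.
import apache_beam as beam
from apache_beam.testing.util import assert_that, equal_to

with beam.Pipeline() as p:
    side = p | 'side' >> beam.Create([10])            # side/Read
    result = (p
              | 'start' >> beam.Create([1, 2, 3])     # start/Read
              | 'compute' >> beam.Map(                # compute/compute
                  lambda x, s: x + s,
                  beam.pvalue.AsSingleton(side)))     # side input feeding compute
    assert_that(result, equal_to([11, 12, 13]))       # assert_that/*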

----------------------------------------------------------------------
Ran 16 tests in 1845.235s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_00_54-12940354374521534737?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_08_15-13219739912963984565?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_16_19-4728816782740441528?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_23_55-3650033612610576768?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_00_54-14754852243713532831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_08_20-4728151698816374518?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_16_20-4825376678519542353?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_23_25-14848662130832923357?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_00_54-3244734529970046186?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_08_00-12264153707764791302?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_14_51-15792549337632610991?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_24_06-936302361190485202?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_00_57-1262055026506098447?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_08_04-2552536034214538719?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_15_24-2558110215161018683?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_08_22_44-1966495943912139555?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1194

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1194/display/redirect>

------------------------------------------
[...truncated 778.32 KB...]
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T09:53:06.918417Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-27_02_53_05-13285974161822679219'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327095247-802111'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_02_53_05-13285974161822679219]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_53_05-13285974161822679219?project=apache-beam-testing
root: INFO: Job 2018-03-27_02_53_05-13285974161822679219 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T09:53:05.856Z: JOB_MESSAGE_WARNING: Job 2018-03-27_02_53_05-13285974161822679219 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T09:53:05.882Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-27_02_53_05-13285974161822679219. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T09:53:05.893Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-27_02_53_05-13285974161822679219.
root: INFO: 2018-03-27T09:53:08.652Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T09:53:08.847Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T09:53:10.042Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T09:53:10.085Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-27T09:53:10.113Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T09:53:10.151Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-27T09:53:10.188Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-27T09:53:10.231Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-27T09:53:10.265Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-27T09:53:10.292Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-27T09:53:10.327Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-27T09:53:10.357Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-27T09:53:10.408Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-27T09:53:10.436Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-27T09:53:10.468Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-27T09:53:10.501Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T09:53:10.533Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T09:53:10.563Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T09:53:10.596Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T09:53:10.634Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-27T09:53:10.697Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-27T09:53:10.728Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-27T09:53:10.764Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-27T09:53:10.812Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-27T09:53:10.845Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-27T09:53:10.890Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-27T09:53:10.926Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-27T09:53:10.956Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-27T09:53:10.990Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-27T09:53:11.133Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: Job 2018-03-27_02_53_05-13285974161822679219 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-27T09:53:11.207Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-27T09:53:11.223Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-27T09:53:11.237Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-27T09:53:11.260Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-27T09:53:11.335Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-27T09:53:11.396Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T09:53:23.045Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T09:53:44.951Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T09:55:50.980Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-27T09:58:57.081Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-27T09:58:57.137Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-27T09:58:57.247Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-27T09:58:57.308Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T09:59:06.138Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T09:59:09.521Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T09:59:12.907Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T09:59:16.283Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
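
All four work-item attempts fail at the same call: apache_beam/transforms/sideinputs.py line 62 invokes _from_runtime_iterable on the side-input view class, and the Dataflow runner's _DataflowIterableSideInput does not define that attribute. A minimal, self-contained sketch of the hook that call site expects; everything here except the _from_runtime_iterable name (which comes from the traceback) is illustrative rather than Beam's actual classes:

    class IterableSideInputView(object):
        # Stand-in for a side-input "view" class.
        @staticmethod
        def _from_runtime_iterable(it, options):
            # Turn the raw iterable of side-input elements into the value the
            # DoFn receives; an iterable-style view can hand it back as-is,
            # while a list/singleton-style view would materialize it here.
            return it

    class SideInputMap(object):
        # Mirrors the per-window cache that raises in the traceback above.
        def __init__(self, view_class, view_options, iterable):
            self._view_class = view_class
            self._view_options = view_options
            self._iterable = iterable
            self._cache = {}

        def __getitem__(self, window):
            if window not in self._cache:
                # The failing call: it assumes the view class defines
                # _from_runtime_iterable.
                self._cache[window] = self._view_class._from_runtime_iterable(
                    self._iterable, self._view_options)
            return self._cache[window]

    side = SideInputMap(IterableSideInputView, {}, [1, 2, 3])
    print(list(side['GlobalWindow']))  # [1, 2, 3]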

root: INFO: 2018-03-27T09:59:16.328Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-27T09:59:16.360Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032709524-03270253-875e-harness-814j,
  beamapp-jenkins-032709524-03270253-875e-harness-814j,
  beamapp-jenkins-032709524-03270253-875e-harness-814j,
  beamapp-jenkins-032709524-03270253-875e-harness-814j
root: INFO: 2018-03-27T09:59:16.458Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T09:59:16.503Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-27T09:59:16.538Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-27T10:00:53.763Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T10:00:53.816Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-27_02_53_05-13285974161822679219 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1838.649s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_30_36-18376419209367674554?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_37_29-18163153546273106202?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_44_25-673395053523851856?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_53_05-13285974161822679219?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_30_37-2145033488735241302?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_37_52-2551443379775041278?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_45_08-5497245918978315579?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_52_47-12842283732413362134?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_30_37-18011235395312602819?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_37_48-2899955956079877693?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_46_00-12158861284466589981?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_52_51-18043010017828202583?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_30_37-8378309140968669157?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_37_24-982620556526332552?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_43_50-12789365986196237977?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_02_53_40-807928402401666723?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user aljoscha.krettek@gmail.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1193

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1193/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-3931] Remove commons-text dependency from Spark runner

------------------------------------------
[...truncated 713.90 KB...]
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s13"
        }, 
        "serialized_fn": "<string of 980 bytes>", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s15", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s14"
        }, 
        "serialized_fn": "<string of 1172 bytes>", 
        "user_name": "assert_that/Match"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s16", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }
                      ], 
                      "is_pair_like": true
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T07:30:41.396264Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-27_00_30_40-6732794308576493871'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327073032-160059'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_00_30_40-6732794308576493871]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_30_40-6732794308576493871?project=apache-beam-testing
root: INFO: Job 2018-03-27_00_30_40-6732794308576493871 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T07:30:40.528Z: JOB_MESSAGE_WARNING: Job 2018-03-27_00_30_40-6732794308576493871 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T07:30:40.553Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-27_00_30_40-6732794308576493871. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T07:30:40.575Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-27_00_30_40-6732794308576493871.
root: INFO: 2018-03-27T07:30:43.315Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T07:30:43.486Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T07:30:44.589Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/1423 instances, 1/45 CPUs, 250/70 disk GB, 0/1998 SSD disk GB, 1/65 instance groups, 1/15 managed instance groups, 1/40 instance templates, 1/275 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
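
In the quota summary above ("required/available"), the 250/70 disk GB entry is the one that cannot be satisfied. The same per-region limits can be read programmatically; a sketch assuming the google-api-python-client package and application-default credentials with read access to the project:

    from googleapiclient import discovery

    def print_region_quotas(project='apache-beam-testing', region='us-central1'):
        compute = discovery.build('compute', 'v1')
        region_info = compute.regions().get(project=project, region=region).execute()
        for quota in region_info.get('quotas', []):
            # Each entry carries 'metric', 'usage' and 'limit', the numbers
            # behind the required/available summary printed above.
            print('%s: %s of %s used' % (quota['metric'], quota['usage'], quota['limit']))

    print_region_quotas()
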
root: INFO: 2018-03-27T07:30:44.672Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T07:30:44.801Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1340.637s

FAILED (errors=6, failures=3)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_16_16-14552909917666893795?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_23_01-13594218865195527655?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_29_57-3664658460935027593?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_30_13-13705164797134355925?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_31_47-6662073015949190515?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_16_15-16202684994640544453?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_23_16-6352041659515829613?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_16_15-9570628382451847279?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_23_12-9778125197842109378?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_30_07-3281434734603837245?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_30_22-18210214244420594446?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_31_51-5730046260210560105?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_16_15-11718059629317674750?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_23_35-11055644641056431970?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_30_40-6732794308576493871?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_00_30_55-5145425812046485663?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1192

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1192/display/redirect?page=changes>

Changes:

[aljoscha.krettek] [BEAM-622] Add checkpointing tests for DoFnOperator and

[aljoscha.krettek] [BEAM-3087] Make reader state update and element emission atomic

[aljoscha.krettek] [BEAM-2393] Make BoundedSource fault-tolerant

------------------------------------------
[...truncated 1.37 MB...]
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T06:56:09.016379Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-26_23_56_08-2624919969539703079'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327065558-806915'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-26_23_56_08-2624919969539703079]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_56_08-2624919969539703079?project=apache-beam-testing
root: INFO: Job 2018-03-26_23_56_08-2624919969539703079 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T06:56:08.136Z: JOB_MESSAGE_WARNING: Job 2018-03-26_23_56_08-2624919969539703079 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T06:56:08.167Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-26_23_56_08-2624919969539703079. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T06:56:08.183Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-26_23_56_08-2624919969539703079.
root: INFO: 2018-03-27T06:56:10.751Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T06:56:10.915Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T06:56:11.853Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T06:56:11.880Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-27T06:56:11.912Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T06:56:11.935Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-27T06:56:11.959Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-27T06:56:11.999Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-27T06:56:12.025Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-27T06:56:12.047Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-27T06:56:12.079Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-27T06:56:12.101Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-27T06:56:12.135Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-27T06:56:12.166Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-27T06:56:12.194Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-27T06:56:12.223Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T06:56:12.251Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T06:56:12.281Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T06:56:12.306Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T06:56:12.334Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-27T06:56:12.366Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-27T06:56:12.397Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-27T06:56:12.427Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-27T06:56:12.452Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-27T06:56:12.484Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-27T06:56:12.511Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-27T06:56:12.529Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-27T06:56:12.561Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-27T06:56:12.586Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-27T06:56:12.728Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-27T06:56:12.781Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-27T06:56:12.815Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: Job 2018-03-26_23_56_08-2624919969539703079 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-27T06:56:14.144Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-27T06:56:14.171Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2018-03-27T06:56:14.255Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-27T06:56:14.330Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T06:56:23.992Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T06:56:45.124Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T06:57:03.200Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-27T07:01:13.360Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-27T07:01:13.405Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-27T07:01:13.533Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-27T07:01:13.596Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T07:01:22.347Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T07:01:25.731Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T07:01:29.108Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T07:01:32.488Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T07:01:32.535Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-27T07:01:32.564Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032706555-03262356-4138-harness-z7rl,
  beamapp-jenkins-032706555-03262356-4138-harness-z7rl,
  beamapp-jenkins-032706555-03262356-4138-harness-z7rl,
  beamapp-jenkins-032706555-03262356-4138-harness-z7rl
root: INFO: 2018-03-27T07:01:32.671Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T07:01:32.717Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-27T07:01:32.736Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-27T07:02:59.101Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T07:02:59.162Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-26_23_56_08-2624919969539703079 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1238.793s

FAILED (errors=15)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_43_46-7344238344133407944?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_45_21-11540249585407662925?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_46_57-10461490287022241126?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_56_08-2624919969539703079?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_43_46-12003192190356716845?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_45_27-10062326548984746461?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_46_52-1985383217991432145?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_53_48-15173757907780720832?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_43_46-9598271915551361368?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_45_31-17110116506461070297?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_47_06-12664660555851172672?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_54_01-15167584006607682798?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_43_46-12303939583333714506?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_45_26-11909229951007724967?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_46_51-14290712548855909004?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_23_56_31-18250439990436689743?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1191

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1191/display/redirect?page=changes>

Changes:

[ehudm] Fix test_pre_finalize_error to test exceptions.

------------------------------------------
[...truncated 779.49 KB...]
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T04:15:00.742405Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-26_21_14_59-1408162820328540407'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327041450-817077'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-26_21_14_59-1408162820328540407]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_21_14_59-1408162820328540407?project=apache-beam-testing
root: INFO: Job 2018-03-26_21_14_59-1408162820328540407 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T04:14:59.909Z: JOB_MESSAGE_WARNING: Job 2018-03-26_21_14_59-1408162820328540407 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T04:14:59.937Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-26_21_14_59-1408162820328540407. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T04:14:59.953Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-26_21_14_59-1408162820328540407.
root: INFO: 2018-03-27T04:15:02.630Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T04:15:02.912Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T04:15:03.662Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T04:15:03.696Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-27T04:15:03.721Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T04:15:03.860Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-27T04:15:03.898Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-27T04:15:03.952Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-27T04:15:03.979Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-27T04:15:04.003Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-27T04:15:04.028Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-27T04:15:04.050Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-27T04:15:04.081Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-27T04:15:04.102Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-27T04:15:04.135Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-27T04:15:04.171Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T04:15:04.201Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T04:15:04.225Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-27T04:15:04.257Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T04:15:04.294Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-27T04:15:04.328Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-27T04:15:04.362Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-27T04:15:04.394Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-27T04:15:04.427Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-27T04:15:04.458Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-27T04:15:04.493Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-27T04:15:04.524Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-27T04:15:04.557Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-27T04:15:04.584Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-27T04:15:04.723Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-27T04:15:04.796Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-27T04:15:04.830Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-27T04:15:04.842Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-27T04:15:04.875Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: Job 2018-03-26_21_14_59-1408162820328540407 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-27T04:15:04.969Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-27T04:15:05.026Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T04:15:12.282Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T04:15:28.082Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T04:15:43.678Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-27T04:19:49.070Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-27T04:19:49.122Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-27T04:19:49.254Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-27T04:19:49.330Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T04:19:54.830Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
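The same traceback repeats for every retry of the work item. The failing call is the view-class dispatch in apache_beam/transforms/sideinputs.py line 62: the side-input map's __getitem__ materializes a window's side input through a classmethod on the view class, and the Dataflow-specific view class _DataflowIterableSideInput apparently does not define _from_runtime_iterable. A minimal sketch of that dispatch follows; the class name _SideInputMapSketch and the constructor arguments are illustrative, while _view_class, _cache and _from_runtime_iterable are taken from the traceback.

    # Sketch only, not the Beam source: shows why a view class without
    # _from_runtime_iterable raises the AttributeError seen above.
    class _SideInputMapSketch(object):
        def __init__(self, view_class, view_options, iterable_fn):
            self._view_class = view_class      # e.g. _DataflowIterableSideInput
            self._view_options = view_options
            self._iterable_fn = iterable_fn    # yields the raw side-input values
            self._cache = {}

        def __getitem__(self, window):
            if window not in self._cache:
                raw = self._iterable_fn(window)
                # Fails with AttributeError when the view class does not
                # implement the hook:
                self._cache[window] = self._view_class._from_runtime_iterable(
                    raw, self._view_options)
            return self._cache[window]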

root: INFO: 2018-03-27T04:19:58.197Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T04:20:01.573Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T04:20:04.959Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T04:20:05.005Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-27T04:20:05.030Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032704145-03262114-45e4-harness-wsmw,
  beamapp-jenkins-032704145-03262114-45e4-harness-wsmw,
  beamapp-jenkins-032704145-03262114-45e4-harness-wsmw,
  beamapp-jenkins-032704145-03262114-45e4-harness-wsmw
root: INFO: 2018-03-27T04:20:05.145Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T04:20:05.196Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-27T04:20:05.217Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-27T04:21:27.160Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T04:21:27.336Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-03-27T04:21:27.391Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-26_21_14_59-1408162820328540407 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1753.406s

FAILED (errors=9)
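All nine errors come from the same side-input code path on the Dataflow runner. For reference, a pipeline of roughly the shape the failing test builds (the step names side, start and compute are taken from the job graph above; the lambda body is an assumption, since sideinputs_test.py itself is not shown here) runs cleanly on the DirectRunner:

    import apache_beam as beam

    with beam.Pipeline() as p:
        side = p | 'side' >> beam.Create([1, 2, 3])
        start = p | 'start' >> beam.Create(['a', 'b'])
        result = start | 'compute' >> beam.FlatMap(
            lambda x, s: [(x, v) for v in s],  # consumes the side input as an iterable
            beam.pvalue.AsIter(side))          # surfaces as _DataflowIterableSideInput in the job log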
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_52_41-14671403459208336604?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_59_31-13009269395438604775?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_21_06_25-30882582973964630?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_21_17_46-14352425261908776870?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_52_40-12108210991989281305?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_21_00_06-5697756324447819393?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_21_07_14-13322938594408528233?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_21_14_12-769749803788843265?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_52_41-8239223803523021163?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_59_49-1434425912239289071?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_21_06_56-15783997383633478113?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_21_14_09-8512061194222107056?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_52_40-5075005634496664639?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_59_57-15679975883589979313?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_21_07_49-3105827011802820288?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_21_14_59-1408162820328540407?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1190

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1190/display/redirect>

------------------------------------------
[...truncated 1.18 MB...]
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }
                      ], 
                      "is_pair_like": true
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey1.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey1"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T03:28:21.274928Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-26_20_28_20-1071160417201048090'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327032759-762713'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-26_20_28_20-1071160417201048090]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_28_20-1071160417201048090?project=apache-beam-testing
root: INFO: Job 2018-03-26_20_28_20-1071160417201048090 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T03:28:20.553Z: JOB_MESSAGE_WARNING: Job 2018-03-26_20_28_20-1071160417201048090 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T03:28:20.583Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-26_20_28_20-1071160417201048090. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T03:28:20.604Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-26_20_28_20-1071160417201048090.
root: INFO: 2018-03-27T03:28:23.218Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T03:28:23.390Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T03:28:24.148Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T03:28:24.175Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-27T03:28:24.201Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T03:28:24.230Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-27T03:28:24.256Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-27T03:28:24.298Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-27T03:28:24.328Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey0 into side list/Read
root: INFO: 2018-03-27T03:28:24.360Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey1 into side list/Read
root: INFO: 2018-03-27T03:28:24.392Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey0 into side list/Read
root: INFO: 2018-03-27T03:28:24.415Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey1 into side list/Read
root: INFO: 2018-03-27T03:28:24.448Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-27T03:28:24.481Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-27T03:28:24.513Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-27T03:28:24.546Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T03:28:24.577Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-27T03:28:24.600Z: JOB_MESSAGE_DETAILED: Unzipping flatten s13 for input s11.out
root: INFO: 2018-03-27T03:28:24.632Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2018-03-27T03:28:24.657Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-27T03:28:24.681Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2018-03-27T03:28:24.703Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-27T03:28:24.736Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-27T03:28:24.768Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:234>)/Map(<lambda at sideinputs_test.py:234>) into main input/Read
root: INFO: 2018-03-27T03:28:24.802Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Map(<lambda at sideinputs_test.py:234>)/Map(<lambda at sideinputs_test.py:234>)
root: INFO: 2018-03-27T03:28:24.834Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-27T03:28:24.862Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-27T03:28:24.881Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-27T03:28:24.913Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-27T03:28:25.048Z: JOB_MESSAGE_DEBUG: Executing wait step start19
root: INFO: 2018-03-27T03:28:25.113Z: JOB_MESSAGE_BASIC: Executing operation side list/Read+Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey0+Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey1+Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey0+Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey1
root: INFO: 2018-03-27T03:28:25.144Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-27T03:28:25.157Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-27T03:28:25.187Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-27T03:28:25.267Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-27T03:28:25.325Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-26_20_28_20-1071160417201048090 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-27T03:28:34.367Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T03:28:50.248Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T03:30:30.613Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-27T03:33:18.403Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-27T03:33:18.436Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:234>)/MapToVoidKey1.out" materialized.
root: INFO: 2018-03-27T03:33:18.470Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:234>)/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-27T03:33:18.500Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:234>)/_DataflowIterableSideInput(MapToVoidKey1.out.0)
root: INFO: 2018-03-27T03:33:18.566Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:234>)/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-27T03:33:18.590Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:234>)/_DataflowIterableSideInput(MapToVoidKey1.out.0).output" materialized.
root: INFO: 2018-03-27T03:33:18.654Z: JOB_MESSAGE_BASIC: Executing operation main input/Read+Map(<lambda at sideinputs_test.py:234>)/Map(<lambda at sideinputs_test.py:234>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T03:33:26.193Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T03:33:29.566Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T03:33:33.012Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T03:33:36.394Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-27T03:33:36.449Z: JOB_MESSAGE_DEBUG: Executing failure step failure18
root: INFO: 2018-03-27T03:33:36.480Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S06:main input/Read+Map(<lambda at sideinputs_test.py:234>)/Map(<lambda at sideinputs_test.py:234>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032703275-03262028-4d65-harness-vnzn,
  beamapp-jenkins-032703275-03262028-4d65-harness-vnzn,
  beamapp-jenkins-032703275-03262028-4d65-harness-vnzn,
  beamapp-jenkins-032703275-03262028-4d65-harness-vnzn
root: INFO: 2018-03-27T03:33:36.597Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T03:33:36.647Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-27T03:33:36.672Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-27T03:34:52.696Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-27T03:34:52.738Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-03-27T03:34:52.779Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-26_20_28_20-1071160417201048090 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 947.739s

FAILED (errors=13)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_19_30-2728932787666292737?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_21_05-16159615220996811829?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_28_52-11935522298675282708?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_30_37-6791756631785984876?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_19_26-11009059240456899439?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_21_03-8648717772200894645?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_29_16-10401431865997665748?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_30_45-8571396170500225053?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_19_26-8147876335107415987?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_20_57-5292438948266701387?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_28_20-1071160417201048090?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_19_26-15885156946341907739?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_20_50-9794857383319155530?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_27_36-13366550775987173846?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_29_18-16344970600444716750?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_20_30_48-4737486524823469355?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1189

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1189/display/redirect?page=changes>

Changes:

[github] Updated to Ubuntu 16 version of python 2

------------------------------------------
[...truncated 829.59 KB...]
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s13"
        }, 
        "serialized_fn": "<string of 980 bytes>", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s15", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s14"
        }, 
        "serialized_fn": "<string of 1156 bytes>", 
        "user_name": "assert_that/Match"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s16", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                          "component_encodings": []
                        }
                      ], 
                      "is_pair_like": true
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "FlatMap(<lambda at sideinputs_test.py:165>)/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "FlatMap(<lambda at sideinputs_test.py:165>)/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-27T00:40:44.011929Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-26_17_40_42-17706661237554409652'
 location: u'us-central1'
 name: u'beamapp-jenkins-0327004035-197713'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-26_17_40_42-17706661237554409652]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_40_42-17706661237554409652?project=apache-beam-testing
root: INFO: Job 2018-03-26_17_40_42-17706661237554409652 is in state JOB_STATE_PENDING
root: INFO: 2018-03-27T00:40:43.025Z: JOB_MESSAGE_WARNING: Job 2018-03-26_17_40_42-17706661237554409652 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T00:40:43.055Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-26_17_40_42-17706661237554409652. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-27T00:40:43.078Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-26_17_40_42-17706661237554409652.
root: INFO: 2018-03-27T00:40:45.843Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-27T00:40:46.140Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-27T00:40:47.446Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/1422 instances, 1/44 CPUs, 250/220 disk GB, 0/1998 SSD disk GB, 1/75 instance groups, 1/25 managed instance groups, 1/50 instance templates, 1/274 in-use IP addresses.

Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
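This run failed before any worker started: reading the quota summary as (required/available) pairs, the only metric where required exceeds available is persistent disk, 250 GB requested against 220 GB free in us-central1. A trivial check of the pairs quoted in the message:

    # Values copied from the quota summary above; the blocker is any metric
    # with required > available.
    quota = {
        'instances': (1, 1422),
        'CPUs': (1, 44),
        'disk GB': (250, 220),
        'SSD disk GB': (0, 1998),
        'instance groups': (1, 75),
        'managed instance groups': (1, 25),
        'instance templates': (1, 50),
        'in-use IP addresses': (1, 274),
    }
    print([m for m, (req, avail) in quota.items() if req > avail])  # ['disk GB']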
root: INFO: 2018-03-27T00:40:47.530Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T00:40:47.654Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 637.902s

FAILED (errors=7, failures=4)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_32_29-2246126519034634739?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_39_13-10839244268794322230?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_40_42-17706661237554409652?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_40_56-11327008838361430202?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_32_29-8645524531144855045?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_39_13-416113375986472529?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_41_07-4854235984433385144?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_32_29-6163489484772594712?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_40_13-2939916998491982124?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_40_27-4502452192139226887?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_40_41-4328670043894794054?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_42_05-12258959855880504194?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_32_29-15333885736125094017?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_39_08-15435676800806207730?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_39_23-14211847503693925690?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_17_41_18-13106112451985220573?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1188

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1188/display/redirect>

------------------------------------------
[...truncated 1.36 MB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-26T21:21:07.603171Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-26_14_21_06-17108805084813922870'
 location: u'us-central1'
 name: u'beamapp-jenkins-0326212058-493419'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-26_14_21_06-17108805084813922870]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_21_06-17108805084813922870?project=apache-beam-testing
root: INFO: Job 2018-03-26_14_21_06-17108805084813922870 is in state JOB_STATE_PENDING
root: INFO: 2018-03-26T21:21:06.702Z: JOB_MESSAGE_WARNING: Job 2018-03-26_14_21_06-17108805084813922870 might autoscale up to 1000 workers.
root: INFO: 2018-03-26T21:21:06.732Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-26_14_21_06-17108805084813922870. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-26T21:21:06.761Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-26_14_21_06-17108805084813922870.
root: INFO: 2018-03-26T21:21:10.043Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-26T21:21:10.423Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-26T21:21:11.251Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T21:21:11.289Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-26T21:21:11.323Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T21:21:11.366Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-26T21:21:11.411Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-26T21:21:11.504Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-26T21:21:11.545Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-26T21:21:11.580Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-26T21:21:11.600Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-26T21:21:11.634Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-26T21:21:11.670Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-26T21:21:11.704Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-26T21:21:11.746Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-26T21:21:11.782Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T21:21:11.812Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T21:21:11.842Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T21:21:11.878Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T21:21:11.905Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-26T21:21:11.930Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-26T21:21:11.959Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-26T21:21:11.993Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-26T21:21:12.027Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-26T21:21:12.061Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-26T21:21:12.096Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-26T21:21:12.126Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-26T21:21:12.154Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-26T21:21:12.178Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-26T21:21:12.379Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-26T21:21:12.455Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-26T21:21:12.490Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-26T21:21:12.504Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-26T21:21:12.556Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-26T21:21:12.677Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-26T21:21:12.764Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-26_14_21_06-17108805084813922870 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-26T21:21:20.295Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T21:21:36.180Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T21:21:56.288Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-26T21:27:09.618Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-26T21:27:09.699Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-26T21:27:09.833Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-26T21:27:09.905Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-26T21:27:15.490Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

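For context on the repeated AttributeError above: the traceback shows that sideinputs.py resolves a side input per window by calling _from_runtime_iterable on the view class the map was built with, so any view class that does not define that hook fails as soon as the DoFn starts. The following is a minimal, self-contained sketch of that lookup pattern, not the actual Beam or Dataflow worker code; the class names SideInputMap (simplified stand-in), FakeIterableView, and BrokenView are illustrative only, with BrokenView playing the role of _DataflowIterableSideInput here.

    class SideInputMap(object):
        # Simplified stand-in mirroring the call shown in the traceback
        # (apache_beam/transforms/sideinputs.py, __getitem__).
        def __init__(self, view_class, view_options, iterable):
            self._view_class = view_class
            self._view_options = view_options
            self._iterable = iterable
            self._cache = {}

        def __getitem__(self, window):
            if window not in self._cache:
                # The view class is expected to provide _from_runtime_iterable;
                # a class without it raises the AttributeError seen in the log.
                self._cache[window] = self._view_class._from_runtime_iterable(
                    self._iterable, self._view_options)
            return self._cache[window]

    class FakeIterableView(object):
        # Hypothetical view class that does implement the hook.
        @staticmethod
        def _from_runtime_iterable(it, options):
            return list(it)

    class BrokenView(object):
        # Lacks _from_runtime_iterable, like the failing class in this job.
        pass

    print(SideInputMap(FakeIterableView, {}, [1, 2, 3])['global'])  # [1, 2, 3]
    SideInputMap(BrokenView, {}, [1, 2, 3])['global']  # raises AttributeError
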
root: INFO: 2018-03-26T21:27:18.929Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T21:27:22.380Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T21:27:25.768Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T21:27:25.844Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-26T21:27:25.889Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032621205-03261421-1fdb-harness-2wl9,
  beamapp-jenkins-032621205-03261421-1fdb-harness-2wl9,
  beamapp-jenkins-032621205-03261421-1fdb-harness-2wl9,
  beamapp-jenkins-032621205-03261421-1fdb-harness-2wl9
root: INFO: 2018-03-26T21:27:26.020Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-26T21:27:26.089Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-26T21:27:26.133Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-26T21:28:53.901Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T21:28:53.999Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-26_14_21_06-17108805084813922870 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1294.020s

FAILED (errors=15)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_09_36-2711388499129017433?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_11_33-913430165854588700?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_13_20-6060393131151100928?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_21_05-7651329441954021705?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_09_37-9176742404902960011?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_11_38-17579174897754211389?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_13_25-16549333738098717875?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_15_36-6913078881926490252?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_23_11-4071739514016195113?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_09_37-13097983460244710849?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_11_23-1841895393853627011?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_13_15-17989955090877917368?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_09_43-5038229703714642434?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_11_28-4160936675038254025?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_13_21-9205557068430918946?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_14_21_06-17108805084813922870?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1187

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1187/display/redirect>

------------------------------------------
[...truncated 781.16 KB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
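The output_info encoding in the job graph above describes the wire format for this step: a windowed value whose element is a (bytes, VarInt) pair in the global window. As a rough illustration only, the equivalent coder nesting built with the public apache_beam.coders classes would look something like the sketch below; the nesting is inferred from the "kind:" entries in the JSON, not taken from the test code itself.

    # A sketch, assuming the standard Beam Python coders.
    from apache_beam.coders import BytesCoder, TupleCoder, VarIntCoder, WindowedValueCoder
    from apache_beam.coders.coders import GlobalWindowCoder

    # kind:pair of kind:bytes and the pickled VarIntCoder component.
    element_coder = TupleCoder([BytesCoder(), VarIntCoder()])
    # kind:windowed_value wrapping the pair, with kind:global_window.
    wire_coder = WindowedValueCoder(element_coder, GlobalWindowCoder())
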
root: INFO: Create job: <Job
 createTime: u'2018-03-26T15:36:38.693256Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-26_08_36_37-11844391878041993465'
 location: u'us-central1'
 name: u'beamapp-jenkins-0326153629-021878'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-26_08_36_37-11844391878041993465]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_36_37-11844391878041993465?project=apache-beam-testing
root: INFO: Job 2018-03-26_08_36_37-11844391878041993465 is in state JOB_STATE_PENDING
root: INFO: 2018-03-26T15:36:37.856Z: JOB_MESSAGE_WARNING: Job 2018-03-26_08_36_37-11844391878041993465 might autoscale up to 1000 workers.
root: INFO: 2018-03-26T15:36:37.880Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-26_08_36_37-11844391878041993465. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-26T15:36:37.909Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-26_08_36_37-11844391878041993465.
root: INFO: 2018-03-26T15:36:40.498Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-26T15:36:40.666Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-26T15:36:41.499Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T15:36:41.528Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-26T15:36:41.551Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T15:36:41.582Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-26T15:36:41.613Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-26T15:36:41.641Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-26T15:36:41.672Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-26T15:36:41.701Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-26T15:36:41.731Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-26T15:36:41.760Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-26T15:36:41.786Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-26T15:36:41.814Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-26T15:36:41.850Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-26T15:36:41.878Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T15:36:41.909Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T15:36:41.930Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T15:36:41.951Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T15:36:41.981Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-26T15:36:42.002Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-26T15:36:42.033Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-26T15:36:42.058Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-26T15:36:42.081Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-26T15:36:42.110Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-26T15:36:42.136Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-26T15:36:42.163Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-26T15:36:42.192Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-26T15:36:42.216Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-26T15:36:42.334Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-26T15:36:42.389Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-26T15:36:42.420Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-26T15:36:42.431Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-26T15:36:42.463Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-26T15:36:42.534Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-26T15:36:42.595Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-26_08_36_37-11844391878041993465 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-26T15:36:51.678Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T15:37:07.605Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T15:37:23.909Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-26T15:42:23.777Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-26T15:42:23.848Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-26T15:42:23.946Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-26T15:42:24.004Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-26T15:42:29.616Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T15:42:32.992Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T15:42:36.379Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T15:42:39.768Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T15:42:39.808Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-26T15:42:39.841Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032615362-03260836-0d20-harness-4zx8,
  beamapp-jenkins-032615362-03260836-0d20-harness-4zx8,
  beamapp-jenkins-032615362-03260836-0d20-harness-4zx8,
  beamapp-jenkins-032615362-03260836-0d20-harness-4zx8
root: INFO: 2018-03-26T15:42:39.953Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-26T15:42:39.982Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-26T15:42:40.013Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-26T15:44:14.503Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T15:44:14.565Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-26_08_36_37-11844391878041993465 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1933.142s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_12_42-9442968899409181095?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_20_12-2959735489907153547?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_30_02-5796624979671361767?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_37_21-10251943069512389978?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_12_41-8780448097799052230?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_19_47-13976019531983714821?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_26_57-16692961830854065577?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_36_37-11844391878041993465?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_12_42-8125239182464464037?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_20_08-14395164038581073611?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_27_33-1609268880341072873?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_35_03-14790759851746623504?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_12_42-14847805630141694172?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_20_22-9671694853212331175?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_28_31-4793532234989557345?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_08_36_17-8647728981481259444?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1186

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1186/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-3892] Make MetricQueryResults and related classes more

------------------------------------------
[...truncated 783.37 KB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-26T14:48:45.063521Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-26_07_48_43-14868869528598401715'
 location: u'us-central1'
 name: u'beamapp-jenkins-0326144834-527165'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-26_07_48_43-14868869528598401715]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_48_43-14868869528598401715?project=apache-beam-testing
root: INFO: Job 2018-03-26_07_48_43-14868869528598401715 is in state JOB_STATE_PENDING
root: INFO: 2018-03-26T14:48:43.946Z: JOB_MESSAGE_WARNING: Job 2018-03-26_07_48_43-14868869528598401715 might autoscale up to 1000 workers.
root: INFO: 2018-03-26T14:48:43.973Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-26_07_48_43-14868869528598401715. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-26T14:48:44.004Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-26_07_48_43-14868869528598401715.
root: INFO: 2018-03-26T14:48:46.939Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-26T14:48:47.107Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-26T14:48:48.051Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T14:48:48.083Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-26T14:48:48.116Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T14:48:48.150Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-26T14:48:48.179Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-26T14:48:48.233Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-26T14:48:48.259Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-26T14:48:48.289Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-26T14:48:48.326Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-26T14:48:48.362Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-26T14:48:48.397Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-26T14:48:48.439Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-26T14:48:48.484Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-26T14:48:48.517Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T14:48:48.548Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T14:48:48.584Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T14:48:48.618Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T14:48:48.647Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-26T14:48:48.683Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-26T14:48:48.704Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-26T14:48:48.739Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-26T14:48:48.771Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-26T14:48:48.804Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-26T14:48:48.843Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-26T14:48:48.876Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-26T14:48:48.905Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-26T14:48:48.934Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-26T14:48:49.118Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-26T14:48:49.202Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-26T14:48:49.226Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-26T14:48:49.238Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-26T14:48:49.271Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: Job 2018-03-26_07_48_43-14868869528598401715 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-26T14:48:49.400Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-26T14:48:49.466Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-26T14:48:56.935Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T14:49:12.728Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T14:51:08.142Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-26T14:54:14.420Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-26T14:54:14.511Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-26T14:54:14.755Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-26T14:54:14.837Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-26T14:54:20.472Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T14:54:23.853Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T14:54:27.232Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T14:54:30.674Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T14:54:30.750Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-26T14:54:30.782Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032614483-03260748-de48-harness-9jwf,
  beamapp-jenkins-032614483-03260748-de48-harness-9jwf,
  beamapp-jenkins-032614483-03260748-de48-harness-9jwf,
  beamapp-jenkins-032614483-03260748-de48-harness-9jwf
root: INFO: 2018-03-26T14:54:30.905Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-26T14:54:30.975Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-26T14:54:31.009Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-26T14:55:53.791Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T14:55:53.860Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-26_07_48_43-14868869528598401715 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1968.004s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_23_31-13738224570470267327?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_31_17-1438420644379725267?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_39_13-15574060940370609993?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_48_43-14868869528598401715?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_23_31-11557225536688757301?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_31_00-14279315547290390721?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_38_26-11114078747600057852?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_47_46-11428996268807468197?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_23_31-10825278497724174033?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_30_47-14866416867615994262?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_37_52-3348175044127285866?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_50_37-15998611948074914299?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_23_32-3692407341030166885?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_31_08-12703073863407207452?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_39_45-8631060731365589910?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_07_46_46-13114264813154511736?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1185

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1185/display/redirect?page=changes>

Changes:

[grzegorz.kolakowski] [BEAM-3800] Set uids on Flink operators

------------------------------------------
[...truncated 770.56 KB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-26T12:16:08.035824Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-26_05_16_07-15108954519713376662'
 location: u'us-central1'
 name: u'beamapp-jenkins-0326121557-987020'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-26_05_16_07-15108954519713376662]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_16_07-15108954519713376662?project=apache-beam-testing
root: INFO: Job 2018-03-26_05_16_07-15108954519713376662 is in state JOB_STATE_PENDING
root: INFO: 2018-03-26T12:16:07.325Z: JOB_MESSAGE_WARNING: Job 2018-03-26_05_16_07-15108954519713376662 might autoscale up to 1000 workers.
root: INFO: 2018-03-26T12:16:07.354Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-26_05_16_07-15108954519713376662. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-26T12:16:07.382Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-26_05_16_07-15108954519713376662.
root: INFO: 2018-03-26T12:16:09.707Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-26T12:16:10.011Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-26T12:16:10.930Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T12:16:10.960Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-26T12:16:10.992Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T12:16:11.023Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-26T12:16:11.052Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-26T12:16:11.083Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-26T12:16:11.110Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-26T12:16:11.138Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-26T12:16:11.171Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-26T12:16:11.204Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-26T12:16:11.230Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-26T12:16:11.253Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-26T12:16:11.277Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-26T12:16:11.330Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T12:16:11.357Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T12:16:11.389Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T12:16:11.417Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T12:16:11.439Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-26T12:16:11.469Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-26T12:16:11.491Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-26T12:16:11.525Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-26T12:16:11.541Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-26T12:16:11.564Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-26T12:16:11.590Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-26T12:16:11.621Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-26T12:16:11.650Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-26T12:16:11.677Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-26T12:16:11.810Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-26T12:16:11.875Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-26T12:16:11.901Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-26T12:16:11.913Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-26T12:16:11.944Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-26T12:16:12.018Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-26T12:16:12.065Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-26_05_16_07-15108954519713376662 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-26T12:16:21.261Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T12:16:37.205Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T12:16:53.062Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-26T12:21:33.524Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-26T12:21:33.575Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-26T12:21:33.703Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-26T12:21:33.769Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-26T12:21:39.306Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T12:21:42.726Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T12:21:46.140Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T12:21:49.529Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T12:21:49.588Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-26T12:21:49.619Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032612155-03260516-677b-harness-sncr,
  beamapp-jenkins-032612155-03260516-677b-harness-sncr,
  beamapp-jenkins-032612155-03260516-677b-harness-sncr,
  beamapp-jenkins-032612155-03260516-677b-harness-sncr
root: INFO: 2018-03-26T12:21:49.747Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-26T12:21:49.807Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-26T12:21:49.836Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-26T12:23:00.392Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T12:23:00.458Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-26_05_16_07-15108954519713376662 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1813.213s

FAILED (errors=9)
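
An aside for anyone triaging this run: the error repeated in the captured logging above is raised inside apache_beam/transforms/sideinputs.py, where the per-window cache asks the side-input view class to build its runtime value and the view class used for compute's side input ('_DataflowIterableSideInput') has no _from_runtime_iterable hook to call. The short sketch below is illustrative only, not the Beam SDK or Dataflow worker code; every name in it except _from_runtime_iterable is hypothetical. It just reproduces the shape of the failure: a view class that omits the hook fails with the same AttributeError the first time a window is looked up.

    # Illustrative sketch only -- simplified stand-ins for the Beam side-input
    # machinery; class names other than _from_runtime_iterable are hypothetical.

    class SideInputMapSketch(object):
        """Caches the materialized side-input value per window (simplified)."""

        def __init__(self, view_class, iterable):
            self._view_class = view_class
            self._iterable = iterable
            self._cache = {}

        def __getitem__(self, window):
            if window not in self._cache:
                # This is the call that fails in the tracebacks above when the
                # view class does not define _from_runtime_iterable.
                self._cache[window] = self._view_class._from_runtime_iterable(
                    self._iterable, {})
            return self._cache[window]

    class IterableView(object):
        """A view class that provides the required hook."""

        @staticmethod
        def _from_runtime_iterable(it, options):
            return list(it)

    class BrokenIterableView(object):
        """A view class missing _from_runtime_iterable, like the one above."""

    if __name__ == '__main__':
        ok = SideInputMapSketch(IterableView, [1, 2, 3])
        print(ok['global_window'])  # prints [1, 2, 3]

        broken = SideInputMapSketch(BrokenIterableView, [1, 2, 3])
        try:
            broken['global_window']
        except AttributeError as e:
            # AttributeError: type object 'BrokenIterableView' has no
            # attribute '_from_runtime_iterable' -- same shape as the
            # JOB_MESSAGE_ERROR entries logged by the workers above.
            print(e)

Whatever the root cause in the SDK/worker pairing for these jobs, the stack trace points at the side-input view class lacking that hook, which is why every attempt of the fused start/Read+compute/compute stage fails at DoOperation.start before any elements are processed.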
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_04_53_20-3645921758202149513?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_00_25-2768909414665202670?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_07_26-11808248766510917081?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_09_00-15712462697794286271?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_16_11-5712458703394537300?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_04_53_20-4850924339110161121?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_00_41-3215096050648947218?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_07_46-2306155479031176007?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_04_53_21-12728818209891037826?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_00_41-14335452522418711800?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_08_41-1979926993614589291?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_16_07-15108954519713376662?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_04_53_21-1895996116726409276?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_00_43-8297024254908284136?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_08_33-13909786326520511148?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_05_15_25-1509118882766578244?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user grzegorz.kolakowski@getindata.com
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1184

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1184/display/redirect>

------------------------------------------
[...truncated 776.40 KB...]
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-26T09:24:17.170257Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-26_02_24_16-15350234844942010447'
 location: u'us-central1'
 name: u'beamapp-jenkins-0326092406-492487'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-26_02_24_16-15350234844942010447]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_24_16-15350234844942010447?project=apache-beam-testing
root: INFO: Job 2018-03-26_02_24_16-15350234844942010447 is in state JOB_STATE_PENDING
root: INFO: 2018-03-26T09:24:16.054Z: JOB_MESSAGE_WARNING: Job 2018-03-26_02_24_16-15350234844942010447 might autoscale up to 1000 workers.
root: INFO: 2018-03-26T09:24:16.071Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-26_02_24_16-15350234844942010447. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-26T09:24:16.081Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-26_02_24_16-15350234844942010447.
root: INFO: 2018-03-26T09:24:18.837Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-26T09:24:19.158Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-26T09:24:20.393Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T09:24:20.418Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-26T09:24:20.440Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T09:24:20.453Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-26T09:24:20.468Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-26T09:24:20.498Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-26T09:24:20.526Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-26T09:24:20.550Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-26T09:24:20.582Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-26T09:24:20.611Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-26T09:24:20.637Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-26T09:24:20.667Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-26T09:24:20.685Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-26T09:24:20.713Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T09:24:20.738Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T09:24:20.767Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T09:24:20.785Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T09:24:20.811Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-26T09:24:20.830Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-26T09:24:20.859Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-26T09:24:20.890Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-26T09:24:20.914Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-26T09:24:20.938Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-26T09:24:20.957Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-26T09:24:20.978Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-26T09:24:21.002Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-26T09:24:21.024Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-26T09:24:21.138Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-26T09:24:21.184Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-26T09:24:21.205Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-26T09:24:21.218Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-26T09:24:21.245Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-26T09:24:21.315Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: Job 2018-03-26_02_24_16-15350234844942010447 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-26T09:24:21.372Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-26T09:24:29.777Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T09:24:45.871Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T09:26:28.738Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-26T09:29:49.848Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-26T09:29:49.891Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-26T09:29:49.978Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-26T09:29:50.016Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-26T09:29:55.552Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T09:29:58.944Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T09:30:02.310Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T09:30:05.701Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T09:30:05.743Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-26T09:30:05.764Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032609240-03260224-8d0e-harness-2qw1,
  beamapp-jenkins-032609240-03260224-8d0e-harness-2qw1,
  beamapp-jenkins-032609240-03260224-8d0e-harness-2qw1,
  beamapp-jenkins-032609240-03260224-8d0e-harness-2qw1
root: INFO: 2018-03-26T09:30:05.866Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-26T09:30:05.909Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-26T09:30:05.950Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-26T09:31:30.709Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T09:31:30.760Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-26_02_24_16-15350234844942010447 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1854.263s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_01_43-11872645076944891303?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_08_54-8004245384144699562?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_16_09-10998878763062203229?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_22_57-5314343399759050786?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_01_43-1271280141359156818?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_08_51-13691433977712613857?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_15_43-4879548617388666995?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_24_53-8763286152316600723?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_01_48-12146216890764896284?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_09_49-712022507068901275?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_17_05-17458179108874954132?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_24_16-15350234844942010447?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_01_43-7536710203390756616?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_08_57-13466665747923955452?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_16_58-14208649484815697681?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-26_02_23_58-8828436145765430535?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1183

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1183/display/redirect>

------------------------------------------
[...truncated 778.85 KB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-26T03:23:50.367083Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-25_20_23_49-3084987148932007378'
 location: u'us-central1'
 name: u'beamapp-jenkins-0326032341-171660'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-25_20_23_49-3084987148932007378]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_23_49-3084987148932007378?project=apache-beam-testing
root: INFO: Job 2018-03-25_20_23_49-3084987148932007378 is in state JOB_STATE_PENDING
root: INFO: 2018-03-26T03:23:49.371Z: JOB_MESSAGE_WARNING: Job 2018-03-25_20_23_49-3084987148932007378 might autoscale up to 1000 workers.
root: INFO: 2018-03-26T03:23:49.399Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-25_20_23_49-3084987148932007378. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-26T03:23:49.420Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-25_20_23_49-3084987148932007378.
root: INFO: 2018-03-26T03:23:51.892Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-26T03:23:52.051Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-26T03:23:53.285Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T03:23:53.309Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-26T03:23:53.338Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-26T03:23:53.368Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-26T03:23:53.398Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-26T03:23:53.429Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-26T03:23:53.462Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-26T03:23:53.492Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-26T03:23:53.518Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-26T03:23:53.551Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-26T03:23:53.582Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-26T03:23:53.609Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-26T03:23:53.640Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-26T03:23:53.672Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T03:23:53.695Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T03:23:53.724Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-26T03:23:53.750Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-26T03:23:53.781Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-26T03:23:53.812Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-26T03:23:53.840Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-26T03:23:53.863Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-26T03:23:53.894Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-26T03:23:53.919Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-26T03:23:53.948Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-26T03:23:53.973Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-26T03:23:54.005Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-26T03:23:54.029Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-26T03:23:54.160Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-26T03:23:54.214Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-26T03:23:54.245Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-26T03:23:54.256Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-26T03:23:54.287Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-26T03:23:54.362Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-26T03:23:54.423Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-25_20_23_49-3084987148932007378 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-26T03:24:02.003Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T03:24:17.974Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T03:26:27.787Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-26T03:29:23.831Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-26T03:29:23.893Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-26T03:29:24.026Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-26T03:29:24.069Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-26T03:29:29.575Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T03:29:32.944Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T03:29:36.308Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T03:29:39.722Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-26T03:29:39.764Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-26T03:29:39.796Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032603234-03252023-c629-harness-hf5n,
  beamapp-jenkins-032603234-03252023-c629-harness-hf5n,
  beamapp-jenkins-032603234-03252023-c629-harness-hf5n,
  beamapp-jenkins-032603234-03252023-c629-harness-hf5n
root: INFO: 2018-03-26T03:29:39.892Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-26T03:29:39.935Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-26T03:29:39.959Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-26T03:30:58.989Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-26T03:30:59.021Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-03-26T03:30:59.055Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-25_20_23_49-3084987148932007378 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1910.230s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_00_50-9723479946904727636?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_07_54-10336324432804536127?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_16_24-18243512168834038830?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_23_49-3084987148932007378?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_00_50-7754627176829098790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_07_42-14764803785348208354?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_14_58-10930010757500979323?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_22_24-9560589107289317957?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_00_50-7767923472070141216?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_08_30-8463163719927068878?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_16_00-17429040202624798126?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_25_05-12590792608612638141?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_00_49-15005998990447423763?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_07_20-2353248477337685062?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_14_15-4266708892953835847?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_20_23_00-9589720662802758968?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1182

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1182/display/redirect>

------------------------------------------
[...truncated 867.34 KB...]
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-25T21:24:57.310881Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-25_14_24_56-4648296128899179550'
 location: u'us-central1'
 name: u'beamapp-jenkins-0325212447-760759'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-25_14_24_56-4648296128899179550]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_24_56-4648296128899179550?project=apache-beam-testing
root: INFO: Job 2018-03-25_14_24_56-4648296128899179550 is in state JOB_STATE_PENDING
root: INFO: 2018-03-25T21:24:56.407Z: JOB_MESSAGE_WARNING: Job 2018-03-25_14_24_56-4648296128899179550 might autoscale up to 1000 workers.
root: INFO: 2018-03-25T21:24:56.426Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-25_14_24_56-4648296128899179550. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-25T21:24:56.447Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-25_14_24_56-4648296128899179550.
root: INFO: 2018-03-25T21:24:59.112Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-25T21:24:59.279Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-25T21:25:00.287Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-25T21:25:00.320Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-25T21:25:00.350Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-25T21:25:00.381Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-25T21:25:00.394Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-25T21:25:00.438Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-25T21:25:00.470Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-25T21:25:00.497Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-25T21:25:00.525Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-25T21:25:00.549Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-25T21:25:00.579Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-25T21:25:00.603Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-25T21:25:00.633Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-25T21:25:00.663Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-25T21:25:00.691Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-25T21:25:00.720Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-25T21:25:00.744Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-25T21:25:00.774Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-25T21:25:00.805Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-25T21:25:00.828Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-25T21:25:00.853Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-25T21:25:00.884Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-25T21:25:00.929Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-25T21:25:00.951Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-25T21:25:00.981Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-25T21:25:01.009Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-25T21:25:01.040Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-25T21:25:01.166Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-25T21:25:01.227Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-25T21:25:01.257Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-25T21:25:01.269Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-25T21:25:01.294Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-25T21:25:01.381Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-25T21:25:01.440Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-25_14_24_56-4648296128899179550 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-25T21:25:11.904Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T21:25:27.918Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T21:26:40.867Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-25T21:30:12.106Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-25T21:30:12.186Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-25T21:30:12.299Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-25T21:30:12.359Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
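
The fused step names above (side/Read, start/Read, compute/compute with a MapToVoidKey0 side input, assert_that/*) correspond to a pipeline of roughly the following shape. This is a hypothetical, simplified reconstruction, not the actual sideinputs_test code:

    import apache_beam as beam
    from apache_beam.pvalue import AsIter
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        side = p | 'side' >> beam.Create([1, 2, 3])
        start = p | 'start' >> beam.Create([10, 20])
        # 'compute' consumes the side PCollection as an iterable side input,
        # which is what the _DataflowIterableSideInput step above materializes.
        result = start | 'compute' >> beam.Map(
            lambda x, s: x + sum(s), AsIter(side))
        assert_that(result, equal_to([16, 26]))
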
root: INFO: 2018-03-25T21:30:21.080Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
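
A minimal, self-contained sketch (hypothetical names, not Beam's actual classes) of the call path the traceback follows: the side-input wrapper's __getitem__ expects the view class to provide a _from_runtime_iterable classmethod, and a view type without it fails exactly as _DataflowIterableSideInput does here:

    class SideInputMapSketch(object):
        """Simplified stand-in for the side-input wrapper in the traceback."""

        def __init__(self, view_class, iterable):
            self._view_class = view_class
            self._iterable = iterable
            self._cache = {}

        def __getitem__(self, window):
            if window not in self._cache:
                # Same pattern as sideinputs.py line 62 in the traceback above.
                self._cache[window] = self._view_class._from_runtime_iterable(
                    self._iterable, {})
            return self._cache[window]

    class GoodIterableView(object):
        @classmethod
        def _from_runtime_iterable(cls, it, options):
            return list(it)

    class BrokenIterableView(object):
        pass  # no _from_runtime_iterable, like _DataflowIterableSideInput

    print(SideInputMapSketch(GoodIterableView, [1, 2, 3])['w'])   # [1, 2, 3]
    try:
        SideInputMapSketch(BrokenIterableView, [1, 2, 3])['w']
    except AttributeError as err:
        print(err)  # ... has no attribute '_from_runtime_iterable'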

root: INFO: 2018-03-25T21:30:24.509Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T21:30:27.875Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T21:30:31.269Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T21:30:31.310Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-25T21:30:31.339Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032521244-03251424-ef6b-harness-scng,
  beamapp-jenkins-032521244-03251424-ef6b-harness-scng,
  beamapp-jenkins-032521244-03251424-ef6b-harness-scng,
  beamapp-jenkins-032521244-03251424-ef6b-harness-scng
root: INFO: 2018-03-25T21:30:31.446Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-25T21:30:31.495Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-25T21:30:31.527Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-25T21:31:44.138Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T21:31:44.236Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-25_14_24_56-4648296128899179550 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1883.022s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_00_46-14112883962680189885?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_08_13-14097781150295332423?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_15_39-7047553628111082246?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_24_59-4185751735537110292?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_00_47-18035240996778807820?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_08_20-10391690085669843670?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_16_40-18235290565308139365?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_23_36-7340815333519927266?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_00_47-10609956709423230437?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_08_32-7677506343115033940?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_15_54-5429013131208565856?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_23_14-12112647300542806593?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_00_48-18397869183350371593?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_08_10-17488042051458935185?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_15_20-6104702732391362580?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_14_24_56-4648296128899179550?project=apache-beam-testing.
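
Each "Found:" line above embeds the region and job id of one test job. A small, hypothetical helper (plain stdlib, not part of the test harness) for pulling those out of a console URL of this shape:

    import re

    url = ("https://console.cloud.google.com/dataflow/jobsDetail/locations/"
           "us-central1/jobs/2018-03-25_14_24_56-4648296128899179550"
           "?project=apache-beam-testing")

    match = re.search(r"locations/([^/]+)/jobs/([^?]+)", url)
    print(match.group(1))  # us-central1
    print(match.group(2))  # 2018-03-25_14_24_56-4648296128899179550
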
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1181

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1181/display/redirect>

------------------------------------------
[...truncated 777.91 KB...]
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-25T15:24:45.294465Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-25_08_24_44-18306953762689851222'
 location: u'us-central1'
 name: u'beamapp-jenkins-0325152435-761649'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-25_08_24_44-18306953762689851222]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_24_44-18306953762689851222?project=apache-beam-testing
root: INFO: Job 2018-03-25_08_24_44-18306953762689851222 is in state JOB_STATE_PENDING
root: INFO: 2018-03-25T15:24:44.379Z: JOB_MESSAGE_WARNING: Job 2018-03-25_08_24_44-18306953762689851222 might autoscale up to 1000 workers.
root: INFO: 2018-03-25T15:24:44.407Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-25_08_24_44-18306953762689851222. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-25T15:24:44.434Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-25_08_24_44-18306953762689851222.
root: INFO: 2018-03-25T15:24:47.901Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-25T15:24:48.070Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-25T15:24:48.993Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-25T15:24:49.020Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-25T15:24:49.049Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-25T15:24:49.080Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-25T15:24:49.108Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-25T15:24:49.145Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-25T15:24:49.177Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-25T15:24:49.209Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-25T15:24:49.236Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-25T15:24:49.264Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-25T15:24:49.294Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-25T15:24:49.324Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-25T15:24:49.344Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-25T15:24:49.366Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-25T15:24:49.397Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-25T15:24:49.430Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-25T15:24:49.454Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-25T15:24:49.488Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-25T15:24:49.511Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-25T15:24:49.539Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-25T15:24:49.563Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-25T15:24:49.594Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-25T15:24:49.622Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-25T15:24:49.653Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-25T15:24:49.683Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-25T15:24:49.717Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-25T15:24:49.749Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-25T15:24:49.886Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-25T15:24:49.956Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-25T15:24:49.988Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-25T15:24:49.999Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-25T15:24:50.030Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-25T15:24:50.107Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-25T15:24:50.169Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-25_08_24_44-18306953762689851222 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-25T15:24:59.505Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T15:25:15.551Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T15:25:32.002Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-25T15:29:37.285Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-25T15:29:37.328Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-25T15:29:37.457Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-25T15:29:37.492Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-25T15:29:46.155Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T15:29:49.509Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T15:29:52.878Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T15:29:56.234Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T15:29:56.273Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-25T15:29:56.292Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032515243-03250824-1e89-harness-1rjm,
  beamapp-jenkins-032515243-03250824-1e89-harness-1rjm,
  beamapp-jenkins-032515243-03250824-1e89-harness-1rjm,
  beamapp-jenkins-032515243-03250824-1e89-harness-1rjm
root: INFO: 2018-03-25T15:29:56.377Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-25T15:29:56.422Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-25T15:29:56.440Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-25T15:31:25.950Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T15:31:25.998Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-25_08_24_44-18306953762689851222 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1910.534s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_00_45-12009987359665837831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_08_48-10301436819678024891?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_16_13-17121545538064174004?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_25_23-11346442696336473675?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_00_44-1631257841038448033?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_08_39-10495849357826068942?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_16_40-3970471226698640187?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_23_55-15083069122216827394?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_00_46-3434025174637954578?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_08_37-12346563639541373228?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_16_04-18046036162193932700?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_23_25-15430381364294285173?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_00_45-3506683603507678930?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_08_17-14080502072249436818?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_15_04-10291882591120974599?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_08_24_44-18306953762689851222?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1180

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1180/display/redirect>

------------------------------------------
[...truncated 779.27 KB...]
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-25T09:24:28.566391Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-25_02_24_27-15700821861519369767'
 location: u'us-central1'
 name: u'beamapp-jenkins-0325092419-379297'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-25_02_24_27-15700821861519369767]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_24_27-15700821861519369767?project=apache-beam-testing
root: INFO: Job 2018-03-25_02_24_27-15700821861519369767 is in state JOB_STATE_PENDING
root: INFO: 2018-03-25T09:24:27.645Z: JOB_MESSAGE_WARNING: Job 2018-03-25_02_24_27-15700821861519369767 might autoscale up to 1000 workers.
root: INFO: 2018-03-25T09:24:27.663Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-25_02_24_27-15700821861519369767. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-25T09:24:27.676Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-25_02_24_27-15700821861519369767.
root: INFO: 2018-03-25T09:24:30.496Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-25T09:24:30.791Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-25T09:24:31.676Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-25T09:24:31.704Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-25T09:24:31.728Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-25T09:24:31.754Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-25T09:24:31.782Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-25T09:24:31.820Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-25T09:24:31.842Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-25T09:24:31.858Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-25T09:24:31.879Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-25T09:24:31.901Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-25T09:24:31.931Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-25T09:24:31.956Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-25T09:24:31.982Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-25T09:24:32.004Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-25T09:24:32.022Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-25T09:24:32.049Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-25T09:24:32.072Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-25T09:24:32.088Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-25T09:24:32.112Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-25T09:24:32.134Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-25T09:24:32.160Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-25T09:24:32.186Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-25T09:24:32.206Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-25T09:24:32.232Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-25T09:24:32.253Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-25T09:24:32.281Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-25T09:24:32.314Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-25T09:24:32.422Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-25T09:24:32.465Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-25T09:24:32.484Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-25T09:24:32.498Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-25T09:24:32.510Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
root: INFO: 2018-03-25T09:24:32.600Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-25T09:24:32.651Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-25_02_24_27-15700821861519369767 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-25T09:24:40.321Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T09:25:04.797Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T09:26:43.455Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-25T09:29:18.083Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-25T09:29:18.128Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-25T09:29:18.221Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-25T09:29:18.267Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-25T09:29:26.965Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T09:29:30.339Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T09:29:33.712Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T09:29:37.081Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T09:29:37.134Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-25T09:29:37.149Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032509241-03250224-a55c-harness-sl2m,
  beamapp-jenkins-032509241-03250224-a55c-harness-sl2m,
  beamapp-jenkins-032509241-03250224-a55c-harness-sl2m,
  beamapp-jenkins-032509241-03250224-a55c-harness-sl2m
root: INFO: 2018-03-25T09:29:37.242Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-25T09:29:37.287Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-25T09:29:37.310Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-25T09:30:36.717Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T09:30:36.767Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-25_02_24_27-15700821861519369767 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1976.584s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_00_45-11386154806426405103?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_07_37-737321854925525581?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_14_36-3343723848690581922?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_25_37-13279680592227264273?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_00_45-16347835746045853476?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_07_46-17148428934526830257?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_16_11-2496031963500258958?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_22_01-7903151057992327616?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_00_46-93709308649177574?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_07_57-4192456689949131519?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_15_27-5419429481090330881?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_24_27-15700821861519369767?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_00_46-11683219341298737998?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_07_42-9022362865070074779?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_14_47-3210715967036012534?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-25_02_22_06-2625757145586033861?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1179

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1179/display/redirect>

------------------------------------------
[...truncated 775.24 KB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
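
The "encoding" block in the job description above nests kind:windowed_value around a kind:pair of kind:bytes and a pickled VarIntCoder, all in the global window. A small, hedged illustration of the equivalent coder composition in the Beam Python SDK follows; it mirrors the structure only, not Dataflow's exact wire format, and the sample value is made up.

from apache_beam.coders import coders
from apache_beam.transforms.window import GlobalWindow
from apache_beam.utils.windowed_value import WindowedValue

# (bytes, varint) key/value pair, wrapped as a windowed value in the global window.
kv_coder = coders.TupleCoder([coders.BytesCoder(), coders.VarIntCoder()])
wv_coder = coders.WindowedValueCoder(kv_coder, coders.GlobalWindowCoder())

encoded = wv_coder.encode(WindowedValue((b'key', 42), 0, [GlobalWindow()]))
print(wv_coder.decode(encoded).value)  # (b'key', 42)
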
root: INFO: Create job: <Job
 createTime: u'2018-03-25T03:24:11.351856Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-24_20_24_10-16182077139363237034'
 location: u'us-central1'
 name: u'beamapp-jenkins-0325032401-695917'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-24_20_24_10-16182077139363237034]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_24_10-16182077139363237034?project=apache-beam-testing
root: INFO: Job 2018-03-24_20_24_10-16182077139363237034 is in state JOB_STATE_PENDING
root: INFO: 2018-03-25T03:24:10.359Z: JOB_MESSAGE_WARNING: Job 2018-03-24_20_24_10-16182077139363237034 might autoscale up to 1000 workers.
root: INFO: 2018-03-25T03:24:10.391Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-24_20_24_10-16182077139363237034. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-25T03:24:10.414Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-24_20_24_10-16182077139363237034.
root: INFO: 2018-03-25T03:24:12.987Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-25T03:24:13.153Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-25T03:24:14.418Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-25T03:24:14.454Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-25T03:24:14.485Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-25T03:24:14.508Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-25T03:24:14.533Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-25T03:24:14.565Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-25T03:24:14.589Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-25T03:24:14.620Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-25T03:24:14.643Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-25T03:24:14.666Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-25T03:24:14.699Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-25T03:24:14.722Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-25T03:24:14.754Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-25T03:24:14.785Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-25T03:24:14.805Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-25T03:24:14.836Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-25T03:24:14.864Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-25T03:24:14.895Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-25T03:24:14.927Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-25T03:24:14.955Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-25T03:24:14.988Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-25T03:24:15.010Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-25T03:24:15.042Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-25T03:24:15.075Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-25T03:24:15.096Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-25T03:24:15.121Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-25T03:24:15.154Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-25T03:24:15.295Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-25T03:24:15.335Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-25T03:24:15.367Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-25T03:24:15.379Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-25T03:24:15.402Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-25T03:24:15.485Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: Job 2018-03-24_20_24_10-16182077139363237034 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-25T03:24:15.534Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-25T03:24:23.692Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T03:24:39.769Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T03:26:34.444Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-25T03:29:17.031Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-25T03:29:17.058Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-25T03:29:17.132Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-25T03:29:17.158Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-25T03:29:21.623Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T03:29:25.076Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T03:29:28.447Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T03:29:31.825Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-25T03:29:31.867Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-25T03:29:31.896Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032503240-03242024-0d7f-harness-91zh,
  beamapp-jenkins-032503240-03242024-0d7f-harness-91zh,
  beamapp-jenkins-032503240-03242024-0d7f-harness-91zh,
  beamapp-jenkins-032503240-03242024-0d7f-harness-91zh
root: INFO: 2018-03-25T03:29:31.995Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-25T03:29:32.039Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-25T03:29:32.070Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-25T03:30:43.033Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-25T03:30:43.071Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-03-25T03:30:43.102Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-24_20_24_10-16182077139363237034 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1852.157s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_00_46-802248415589347953?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_08_23-14521565146244481406?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_15_35-12045176298827915965?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_24_10-16182077139363237034?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_00_46-15932091811513432381?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_08_40-13940362691222491522?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_16_51-480543310051665446?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_24_26-3157979699350060345?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_00_47-14489595957379940765?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_08_14-4032358014870025629?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_15_36-11640951111217372216?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_22_07-3393201885241070664?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_00_47-14728937515731703102?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_08_23-3071581958425272525?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_16_48-13034458681851607014?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_20_23_19-13767877595648138120?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com
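
The tracebacks in build #1179 above all terminate in sideinputs.py, where SideInputMap.__getitem__ calls self._view_class._from_runtime_iterable(...) and the runner-supplied view class _DataflowIterableSideInput does not define that hook. A minimal, self-contained sketch of that failure pattern follows; every name other than _from_runtime_iterable is illustrative, not Beam's actual implementation.

class _ListSideInput(object):
    # A well-behaved view class: turns the runtime iterable into the value
    # handed to the DoFn for one window.
    @staticmethod
    def _from_runtime_iterable(it, options):
        return list(it)


class _BrokenSideInput(object):
    # Deliberately missing _from_runtime_iterable, like the
    # _DataflowIterableSideInput in the tracebacks above.
    pass


class SideInputMap(object):
    # Per-window cache that defers to the view class to materialize the value.
    def __init__(self, view_class, view_options, iterable):
        self._view_class = view_class
        self._view_options = view_options
        self._iterable = iterable
        self._cache = {}

    def __getitem__(self, window):
        if window not in self._cache:
            self._cache[window] = self._view_class._from_runtime_iterable(
                self._iterable, self._view_options)
        return self._cache[window]


print(SideInputMap(_ListSideInput, {}, [1, 2, 3])['global'])   # [1, 2, 3]
try:
    SideInputMap(_BrokenSideInput, {}, [1, 2, 3])['global']
except AttributeError as e:
    print(e)  # no attribute '_from_runtime_iterable', as in the job logs above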

Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1178

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1178/display/redirect>

------------------------------------------
[...truncated 779.18 KB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-24T21:24:53.019643Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-24_14_24_51-5163007450373142262'
 location: u'us-central1'
 name: u'beamapp-jenkins-0324212443-985988'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-24_14_24_51-5163007450373142262]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_24_51-5163007450373142262?project=apache-beam-testing
root: INFO: Job 2018-03-24_14_24_51-5163007450373142262 is in state JOB_STATE_PENDING
root: INFO: 2018-03-24T21:24:52.006Z: JOB_MESSAGE_WARNING: Job 2018-03-24_14_24_51-5163007450373142262 might autoscale up to 1000 workers.
root: INFO: 2018-03-24T21:24:52.025Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-24_14_24_51-5163007450373142262. The number of workers will be between 1 and 1000.
root: INFO: 2018-03-24T21:24:52.047Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-24_14_24_51-5163007450373142262.
root: INFO: 2018-03-24T21:24:54.850Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-24T21:24:55.144Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-24T21:24:56.225Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-24T21:24:56.260Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-24T21:24:56.288Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-24T21:24:56.321Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-24T21:24:56.354Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-24T21:24:56.394Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-24T21:24:56.426Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-24T21:24:56.448Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-24T21:24:56.479Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-24T21:24:56.510Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-24T21:24:56.537Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-24T21:24:56.567Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-24T21:24:56.598Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-24T21:24:56.628Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-24T21:24:56.653Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-24T21:24:56.677Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-24T21:24:56.700Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-24T21:24:56.721Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-24T21:24:56.752Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-24T21:24:56.776Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-24T21:24:56.807Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-24T21:24:56.837Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-24T21:24:56.866Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-24T21:24:56.896Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-24T21:24:56.927Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-24T21:24:56.969Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-24T21:24:56.997Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-24T21:24:57.115Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-24T21:24:57.173Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: Job 2018-03-24_14_24_51-5163007450373142262 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-24T21:24:57.208Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-24T21:24:57.220Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-24T21:24:57.251Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-24T21:24:57.335Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-24T21:24:57.392Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-24T21:25:04.969Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-24T21:25:21.016Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-24T21:26:44.268Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-24T21:30:02.312Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-24T21:30:02.374Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-24T21:30:02.533Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-24T21:30:02.611Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-24T21:30:08.188Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-24T21:30:11.578Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-24T21:30:14.948Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-24T21:30:18.306Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-24T21:30:18.350Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-24T21:30:18.381Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032421244-03241424-b9bf-harness-ts92,
  beamapp-jenkins-032421244-03241424-b9bf-harness-ts92,
  beamapp-jenkins-032421244-03241424-b9bf-harness-ts92,
  beamapp-jenkins-032421244-03241424-b9bf-harness-ts92
root: INFO: 2018-03-24T21:30:18.484Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-24T21:30:18.531Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-24T21:30:18.554Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-24T21:31:36.469Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-24T21:31:36.502Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-03-24T21:31:36.541Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-24_14_24_51-5163007450373142262 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 2041.045s

FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_00_42-9112215217226842691?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_08_12-18083876752052807208?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_18_47-17712894071539371165?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_26_37-12108562193600054259?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_00_42-4062278591180194116?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_08_06-4213171757650998116?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_15_07-16541090822654230911?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_24_51-5163007450373142262?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_00_44-17402192440261633561?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_08_26-13584060711071725539?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_15_55-5677324123084318269?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_23_40-15592817197832922088?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_00_43-11632894380409270909?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_08_07-15350138640753310709?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_15_22-10852082099417832016?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-24_14_24_32-4122970132700239562?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user szewinho@gmail.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user jb@nanthrax.net
Not sending mail to unregistered user mariand@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com
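
For reference, the step names in the failing jobs above (start/Read, side/Read, compute/compute, compute/MapToVoidKey0, assert_that/...) correspond to a small side-input pipeline of roughly the following shape. This is a hedged reconstruction, not the actual ValidatesRunner test body; running it on the DirectRunner is a quick way to check that the failure is specific to the Dataflow runner's side-input handling rather than to pipeline construction.

import apache_beam as beam
from apache_beam.testing.util import assert_that, equal_to

# Main input, a singleton side input consumed as a list, and an asserted result.
with beam.Pipeline('DirectRunner') as p:
    start = p | 'start' >> beam.Create([1, 2, 3])
    side = p | 'side' >> beam.Create([10])
    result = start | 'compute' >> beam.Map(
        lambda x, s: x + s[0], beam.pvalue.AsList(side))
    assert_that(result, equal_to([11, 12, 13]))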