Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2016/10/18 20:53:04 UTC

Build failed in Jenkins: beam_PostCommit_PythonVerify #565

See <https://builds.apache.org/job/beam_PostCommit_PythonVerify/565/changes>

Changes:

[robertwb] Windowed side input test.

[robertwb] Implement windowed side inputs for direct runner.

[robertwb] Fix tests expecting list from AsIter.

[robertwb] Implement windowed side inputs for InProcess runner.

[robertwb] More complicated window tests.

[robertwb] Optimize globally windowed side input case

[robertwb] Minor fixups for better testing

[robertwb] Rename from_iterable to avoid confusion.
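
(Editorial note: these commits all touch the Python SDK's side-input machinery. As a rough illustration of the feature under test, a windowed side input, here is a minimal sketch; it is not the failing test itself, and the step names are illustrative.)

import apache_beam as beam
from apache_beam.transforms import window

with beam.Pipeline() as p:
  main = (p
          | 'main' >> beam.Create([('a', 1), ('b', 2)])
          | 'window_main' >> beam.WindowInto(window.FixedWindows(60)))
  side = (p
          | 'side' >> beam.Create([100])
          | 'window_side' >> beam.WindowInto(window.FixedWindows(60)))
  # Each main-input window is paired with the matching side-input window;
  # AsSingleton requires exactly one side element per window.
  out = main | 'add_offset' >> beam.Map(
      lambda kv, offset: (kv[0], kv[1] + offset),
      offset=beam.pvalue.AsSingleton(side))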

------------------------------------------
[...truncated 2889 lines...]
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eJxrYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "TimestampCoder$eJxrYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlwhmbmpxSWJuQXOID5XIYNmYyFjbSFTkh4ANWETWg==", 
                  "component_encodings": []
                }, 
                {
                  "@type": "SingletonCoder$<string of 252 bytes>", 
                  "component_encodings": []
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/WriteImpl/finalize_write.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "<string of 1292 bytes>", 
        "user_name": "write/WriteImpl/finalize_write"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
INFO:root:Create job: <Job
 id: u'2016-10-18_13_48_20-13116714018908792973'
 projectId: u'apache-beam-testing'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:root:Created job with id: [2016-10-18_13_48_20-13116714018908792973]
INFO:root:To access the Dataflow monitoring console, please navigate to https://console.developers.google.com/project/apache-beam-testing/dataflow/job/2016-10-18_13_48_20-13116714018908792973
INFO:root:Job 2016-10-18_13_48_20-13116714018908792973 is in state JOB_STATE_RUNNING
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d55e6: 2016-10-18T20:48:20.966Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade31d): Checking required Cloud APIs are enabled.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5700: 2016-10-18T20:48:21.248Z: JOB_MESSAGE_DEBUG: (cc0f1c724dadedf5): Combiner lifting skipped for step write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5702: 2016-10-18T20:48:21.250Z: JOB_MESSAGE_DEBUG: (cc0f1c724dade8ab): Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5704: 2016-10-18T20:48:21.252Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade361): Expanding GroupByKey operations into optimizable parts.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5707: 2016-10-18T20:48:21.255Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadee17): Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d570e: 2016-10-18T20:48:21.262Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadee39): Annotating graph with Autotuner information.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d573d: 2016-10-18T20:48:21.309Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade911): Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5740: 2016-10-18T20:48:21.312Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade3c7): Fusing consumer split into read
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5743: 2016-10-18T20:48:21.315Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadee7d): Fusing consumer group/Reify into pair_with_one
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5746: 2016-10-18T20:48:21.318Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade933): Fusing consumer format into count
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5749: 2016-10-18T20:48:21.321Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade3e9): Fusing consumer write/WriteImpl/GroupByKey/GroupByWindow into write/WriteImpl/GroupByKey/Read
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d574b: 2016-10-18T20:48:21.323Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadee9f): Fusing consumer write/WriteImpl/GroupByKey/Write into write/WriteImpl/GroupByKey/Reify
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5752: 2016-10-18T20:48:21.330Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade40b): Fusing consumer write/WriteImpl/FlatMap(<lambda at iobase.py:758>) into write/WriteImpl/GroupByKey/GroupByWindow
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5755: 2016-10-18T20:48:21.333Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadeec1): Fusing consumer count into group/GroupByWindow
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5758: 2016-10-18T20:48:21.336Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade977): Fusing consumer write/WriteImpl/WindowInto into write/WriteImpl/Map(<lambda at iobase.py:755>)
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d575b: 2016-10-18T20:48:21.339Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade42d): Fusing consumer write/WriteImpl/GroupByKey/Reify into write/WriteImpl/WindowInto
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d575d: 2016-10-18T20:48:21.341Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadeee3): Fusing consumer write/WriteImpl/Map(<lambda at iobase.py:755>) into write/WriteImpl/write_bundles
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5760: 2016-10-18T20:48:21.344Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade999): Fusing consumer pair_with_one into split
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5762: 2016-10-18T20:48:21.346Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade44f): Fusing consumer group/GroupByWindow into group/Read
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5765: 2016-10-18T20:48:21.349Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadef05): Fusing consumer write/WriteImpl/write_bundles into format
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d576a: 2016-10-18T20:48:21.354Z: JOB_MESSAGE_DETAILED: (cc0f1c724dade9bb): Fusing consumer group/Write into group/Reify
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d57c1: 2016-10-18T20:48:21.441Z: JOB_MESSAGE_DEBUG: (cc0f1c724dade07b): Workflow config is missing a default resource spec.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d57c3: 2016-10-18T20:48:21.443Z: JOB_MESSAGE_DETAILED: (cc0f1c724dadeb31): Adding StepResource setup and teardown to workflow graph.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d57f8: 2016-10-18T20:48:21.496Z: JOB_MESSAGE_DEBUG: (f4efa5dde1605941): Adding workflow start and stop steps.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5802: 2016-10-18T20:48:21.506Z: JOB_MESSAGE_DEBUG: (19e2a43502bd1fce): Assigning stage ids.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5865: 2016-10-18T20:48:21.605Z: JOB_MESSAGE_DEBUG: (16a2eef936374c8e): Executing wait step start2
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5870: 2016-10-18T20:48:21.616Z: JOB_MESSAGE_DEBUG: (16a2eef936374351): Executing operation write/WriteImpl/DoOnce
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5879: 2016-10-18T20:48:21.625Z: JOB_MESSAGE_BASIC: S01: (16a2eef93637483b): Executing operation group/Create
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d587d: 2016-10-18T20:48:21.629Z: JOB_MESSAGE_DEBUG: (b9ce855eb405fb82): Value "write/WriteImpl/DoOnce.out" materialized.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5895: 2016-10-18T20:48:21.653Z: JOB_MESSAGE_BASIC: S04: (cc0f1c724dade09d): Executing operation write/WriteImpl/initialize_write
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5944: 2016-10-18T20:48:21.828Z: JOB_MESSAGE_DEBUG: (3cceca373b352626): Starting worker pool setup.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5946: 2016-10-18T20:48:21.830Z: JOB_MESSAGE_BASIC: (3cceca373b352a50): Starting 1 workers...
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5959: 2016-10-18T20:48:21.849Z: JOB_MESSAGE_DEBUG: (decbc9edbb3cc0f3): Value "group/Session" materialized.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98d5964: 2016-10-18T20:48:21.860Z: JOB_MESSAGE_BASIC: S02: (37d85b62a91607ee): Executing operation read+split+pair_with_one+group/Reify+group/Write
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98f0022: 2016-10-18T20:50:10.082Z: JOB_MESSAGE_DETAILED: (9b51a8a70519ded7): Workers have started successfully.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fbebc: 2016-10-18T20:50:58.876Z: JOB_MESSAGE_DEBUG: (decbc9edbb3cc8a0): Value "write/WriteImpl/initialize_write.out" materialized.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fbec7: 2016-10-18T20:50:58.887Z: JOB_MESSAGE_BASIC: S05: (f4efa5dde1605ffb): Executing operation write/WriteImpl/ViewAsSingleton(write|WriteImpl|initialize_write.None)/CreatePCollectionView
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fbf21: 2016-10-18T20:50:58.977Z: JOB_MESSAGE_DEBUG: (41f77b7ab7cd6fa8): Value "write/WriteImpl/ViewAsSingleton(write|WriteImpl|initialize_write.None)/CreatePCollectionView.out" materialized.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc5a8: 2016-10-18T20:51:00.648Z: JOB_MESSAGE_BASIC: S03: (b9ce855eb405f56a): Executing operation group/Close
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc5c1: 2016-10-18T20:51:00.673Z: JOB_MESSAGE_BASIC: S06: (37d85b62a916051e): Executing operation write/WriteImpl/GroupByKey/Create
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc671: 2016-10-18T20:51:00.849Z: JOB_MESSAGE_DEBUG: (16a2eef9363745c1): Value "write/WriteImpl/GroupByKey/Session" materialized.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc683: 2016-10-18T20:51:00.867Z: JOB_MESSAGE_BASIC: S07: (848e4970f9b90b1f): Executing operation group/Read+group/GroupByWindow+count+format+write/WriteImpl/write_bundles+write/WriteImpl/Map(<lambda at iobase.py:755>)+write/WriteImpl/WindowInto+write/WriteImpl/GroupByKey/Reify+write/WriteImpl/GroupByKey/Write
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc847: 2016-10-18T20:51:01.319Z: JOB_MESSAGE_ERROR: (f57f9314dc1c4db6): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack
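
(Editorial note: the failing statement is the two-value unpack of view_options in the worker's _read_side_inputs. A plausible reading, an assumption the log does not confirm, is that the windowed side-input changes above serialize view_options in a new shape, leaving the deployed dataflow_worker with fewer than two values to unpack. The Python error itself is easy to reproduce:)

# Hypothetical value: the worker expects a (has_default, default) pair
# but receives a 1-element tuple.
view_options = ('no-default',)
try:
    has_default, default = view_options
except ValueError as e:
    print(e)  # Python 2 prints: need more than 1 value to unpack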

INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc8ff: 2016-10-18T20:51:01.503Z: JOB_MESSAGE_ERROR: (f57f9314dc1c434d): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fc9b0: 2016-10-18T20:51:01.680Z: JOB_MESSAGE_ERROR: (f57f9314dc1c48e4): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fca71: 2016-10-18T20:51:01.873Z: JOB_MESSAGE_ERROR: (f57f9314dc1c4e7b): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fcb5d: 2016-10-18T20:51:02.109Z: JOB_MESSAGE_ERROR: (f57f9314dc1c4412): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fcc0e: 2016-10-18T20:51:02.286Z: JOB_MESSAGE_ERROR: (f57f9314dc1c49a9): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fcced: 2016-10-18T20:51:02.509Z: JOB_MESSAGE_ERROR: (f57f9314dc1c4f40): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fcd93: 2016-10-18T20:51:02.675Z: JOB_MESSAGE_ERROR: (f57f9314dc1c44d7): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fcdbc: 2016-10-18T20:51:02.716Z: JOB_MESSAGE_DEBUG: (b8f75ec1d16f6d1e): Executing failure step failure1
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fcdbe: 2016-10-18T20:51:02.718Z: JOB_MESSAGE_ERROR: (b8f75ec1d16f6950): Workflow failed. Causes: (848e4970f9b90f7d): S07:group/Read+group/GroupByWindow+count+format+write/WriteImpl/write_bundles+write/WriteImpl/Map(<lambda at iobase.py:755>)+write/WriteImpl/WindowInto+write/WriteImpl/GroupByKey/Reify+write/WriteImpl/GroupByKey/Write failed.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fcdf2: 2016-10-18T20:51:02.770Z: JOB_MESSAGE_DETAILED: (41f77b7ab7cd6989): Cleaning up.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fce79: 2016-10-18T20:51:02.905Z: JOB_MESSAGE_DEBUG: (41f77b7ab7cd6cd4): Starting worker pool teardown.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d98fce7c: 2016-10-18T20:51:02.908Z: JOB_MESSAGE_BASIC: (41f77b7ab7cd6f06): Stopping worker pool...
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d99168ad: 2016-10-18T20:52:47.917Z: JOB_MESSAGE_BASIC: (41f77b7ab7cd6251): Worker pool stopped.
INFO:root:2016-10-18_13_48_20-13116714018908792973_00000157d9916cab: 2016-10-18T20:52:48.939Z: JOB_MESSAGE_DEBUG: (41f77b7ab7cd67ce): Tearing down pending resources...
INFO:root:Job 2016-10-18_13_48_20-13116714018908792973 is in state JOB_STATE_FAILED
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "<https://builds.apache.org/job/beam_PostCommit_PythonVerify/ws/sdks/python/apache_beam/examples/wordcount.py",> line 107, in <module>
    run()
  File "<https://builds.apache.org/job/beam_PostCommit_PythonVerify/ws/sdks/python/apache_beam/examples/wordcount.py",> line 98, in run
    result = p.run()
  File "apache_beam/pipeline.py", line 159, in run
    return self.runner.run(self)
  File "apache_beam/runners/dataflow_runner.py", line 188, in run
    % getattr(self, 'last_error_msg', None), self.result)
apache_beam.runners.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed:
(f57f9314dc1c44d7): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack


# Grep will exit with status 1 if success message was not found.
echo ">>> CHECKING JOB SUCCESS"
>>> CHECKING JOB SUCCESS
grep JOB_STATE_DONE job_output
Build step 'Execute shell' marked build as failure
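
(Editorial note: the shell step greps the captured log, the job_output file named above, for the terminal state string; since this job ended in JOB_STATE_FAILED, grep exits 1 and Jenkins marks the build failed. A minimal Python equivalent of that check, assuming the same job_output file:)

import sys

# The build passes only if the captured log contains JOB_STATE_DONE,
# mirroring `grep JOB_STATE_DONE job_output` and its exit status.
with open('job_output') as f:
    ok = any('JOB_STATE_DONE' in line for line in f)
sys.exit(0 if ok else 1)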

Re: Build failed in Jenkins: beam_PostCommit_PythonVerify #565

Posted by Robert Bradshaw <ro...@google.com>.
I'm looking into this.

On Tue, Oct 18, 2016 at 1:53 PM, Apache Jenkins Server
<je...@builds.apache.org> wrote:
> See <https://builds.apache.org/job/beam_PostCommit_PythonVerify/565/changes>

Jenkins build is back to normal : beam_PostCommit_PythonVerify #567

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_PythonVerify/567/>


Build failed in Jenkins: beam_PostCommit_PythonVerify #566

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_PythonVerify/566/>

------------------------------------------
[...truncated 2850 lines...]
                    }
                  ], 
                  "is_wrapper": true
                }
              ]
            }, 
            "output_name": "out", 
            "user_name": "write/WriteImpl/ViewAsIterable(write|WriteImpl|FlatMap(<lambda at iobase.py:758>).None)/CreatePCollectionView.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s14"
        }, 
        "user_name": "write/WriteImpl/ViewAsIterable(write|WriteImpl|FlatMap(<lambda at iobase.py:758>).None)/CreatePCollectionView"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s16", 
      "properties": {
        "non_parallel_inputs": {
          "s15": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "s15"
          }, 
          "s9": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "s9"
          }
        }, 
        "output_info": [
          {
            "encoding": {
              "@type": "WindowedValueCoder$<string of 408 bytes>", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eJxrYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eJxrYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eJxrYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "TimestampCoder$eJxrYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlwhmbmpxSWJuQXOID5XIYNmYyFjbSFTkh4ANWETWg==", 
                  "component_encodings": []
                }, 
                {
                  "@type": "SingletonCoder$<string of 252 bytes>", 
                  "component_encodings": []
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/WriteImpl/finalize_write.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "<string of 1292 bytes>", 
        "user_name": "write/WriteImpl/finalize_write"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
INFO:root:Create job: <Job
 id: u'2016-10-18_14_04_17-7862377147606629765'
 projectId: u'apache-beam-testing'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:root:Created job with id: [2016-10-18_14_04_17-7862377147606629765]
INFO:root:To access the Dataflow monitoring console, please navigate to https://console.developers.google.com/project/apache-beam-testing/dataflow/job/2016-10-18_14_04_17-7862377147606629765
INFO:root:Job 2016-10-18_14_04_17-7862377147606629765 is in state JOB_STATE_RUNNING
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf356: 2016-10-18T21:04:18.774Z: JOB_MESSAGE_DETAILED: (2787e482e4005483): Checking required Cloud APIs are enabled.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf467: 2016-10-18T21:04:19.047Z: JOB_MESSAGE_DEBUG: (2787e482e4005f84): Combiner lifting skipped for step write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf46a: 2016-10-18T21:04:19.050Z: JOB_MESSAGE_DEBUG: (2787e482e40051f6): Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf46c: 2016-10-18T21:04:19.052Z: JOB_MESSAGE_DETAILED: (2787e482e4005468): Expanding GroupByKey operations into optimizable parts.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf46f: 2016-10-18T21:04:19.055Z: JOB_MESSAGE_DETAILED: (2787e482e40056da): Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf476: 2016-10-18T21:04:19.062Z: JOB_MESSAGE_DETAILED: (2787e482e4005e30): Annotating graph with Autotuner information.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf497: 2016-10-18T21:04:19.095Z: JOB_MESSAGE_DETAILED: (2787e482e40057f8): Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf49b: 2016-10-18T21:04:19.099Z: JOB_MESSAGE_DETAILED: (2787e482e4005a6a): Fusing consumer split into read
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf49d: 2016-10-18T21:04:19.101Z: JOB_MESSAGE_DETAILED: (2787e482e4005cdc): Fusing consumer group/Reify into pair_with_one
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4a0: 2016-10-18T21:04:19.104Z: JOB_MESSAGE_DETAILED: (2787e482e4005f4e): Fusing consumer format into count
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4a2: 2016-10-18T21:04:19.106Z: JOB_MESSAGE_DETAILED: (2787e482e40051c0): Fusing consumer write/WriteImpl/GroupByKey/GroupByWindow into write/WriteImpl/GroupByKey/Read
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4a4: 2016-10-18T21:04:19.108Z: JOB_MESSAGE_DETAILED: (2787e482e4005432): Fusing consumer write/WriteImpl/GroupByKey/Write into write/WriteImpl/GroupByKey/Reify
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4a9: 2016-10-18T21:04:19.113Z: JOB_MESSAGE_DETAILED: (2787e482e4005916): Fusing consumer write/WriteImpl/FlatMap(<lambda at iobase.py:758>) into write/WriteImpl/GroupByKey/GroupByWindow
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4b0: 2016-10-18T21:04:19.120Z: JOB_MESSAGE_DETAILED: (2787e482e4005b88): Fusing consumer count into group/GroupByWindow
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4b2: 2016-10-18T21:04:19.122Z: JOB_MESSAGE_DETAILED: (2787e482e4005dfa): Fusing consumer write/WriteImpl/WindowInto into write/WriteImpl/Map(<lambda at iobase.py:755>)
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4b5: 2016-10-18T21:04:19.125Z: JOB_MESSAGE_DETAILED: (2787e482e400506c): Fusing consumer write/WriteImpl/GroupByKey/Reify into write/WriteImpl/WindowInto
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4b7: 2016-10-18T21:04:19.127Z: JOB_MESSAGE_DETAILED: (2787e482e40052de): Fusing consumer write/WriteImpl/Map(<lambda at iobase.py:755>) into write/WriteImpl/write_bundles
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4b9: 2016-10-18T21:04:19.129Z: JOB_MESSAGE_DETAILED: (2787e482e4005550): Fusing consumer pair_with_one into split
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4bc: 2016-10-18T21:04:19.132Z: JOB_MESSAGE_DETAILED: (2787e482e40057c2): Fusing consumer group/GroupByWindow into group/Read
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4be: 2016-10-18T21:04:19.134Z: JOB_MESSAGE_DETAILED: (2787e482e4005a34): Fusing consumer write/WriteImpl/write_bundles into format
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf4c0: 2016-10-18T21:04:19.136Z: JOB_MESSAGE_DETAILED: (2787e482e4005ca6): Fusing consumer group/Write into group/Reify
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf505: 2016-10-18T21:04:19.205Z: JOB_MESSAGE_DEBUG: (2787e482e4005ae6): Workflow config is missing a default resource spec.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf507: 2016-10-18T21:04:19.207Z: JOB_MESSAGE_DETAILED: (2787e482e4005d58): Adding StepResource setup and teardown to workflow graph.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf524: 2016-10-18T21:04:19.236Z: JOB_MESSAGE_DEBUG: (6ea0de46e224f918): Adding workflow start and stop steps.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf550: 2016-10-18T21:04:19.280Z: JOB_MESSAGE_DEBUG: (afad86e2b9352ea0): Assigning stage ids.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf5be: 2016-10-18T21:04:19.390Z: JOB_MESSAGE_DEBUG: (6ea0de46e224f987): Executing wait step start2
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf5c8: 2016-10-18T21:04:19.400Z: JOB_MESSAGE_BASIC: S01: (2de4ec76a3165b6d): Executing operation group/Create
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf5cb: 2016-10-18T21:04:19.403Z: JOB_MESSAGE_DEBUG: (8491705b90a34fbd): Executing operation write/WriteImpl/DoOnce
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf5e5: 2016-10-18T21:04:19.429Z: JOB_MESSAGE_DEBUG: (82f4625d69dc6081): Value "write/WriteImpl/DoOnce.out" materialized.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf5f0: 2016-10-18T21:04:19.440Z: JOB_MESSAGE_BASIC: S04: (5f34236610ccc0e1): Executing operation write/WriteImpl/initialize_write
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf694: 2016-10-18T21:04:19.604Z: JOB_MESSAGE_DEBUG: (ef48b2d9a419ccf0): Starting worker pool setup.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf696: 2016-10-18T21:04:19.606Z: JOB_MESSAGE_BASIC: (ef48b2d9a419ca3a): Starting 1 workers...
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf6aa: 2016-10-18T21:04:19.626Z: JOB_MESSAGE_DEBUG: (28ca00b690721d4d): Value "group/Session" materialized.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99bf6e0: 2016-10-18T21:04:19.680Z: JOB_MESSAGE_BASIC: S02: (4e0f72cd0696e1d4): Executing operation read+split+pair_with_one+group/Reify+group/Write
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99d7f14: 2016-10-18T21:06:00.084Z: JOB_MESSAGE_DETAILED: (474c745854a5aa85): Workers have started successfully.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e3f60: 2016-10-18T21:06:49.312Z: JOB_MESSAGE_DEBUG: (2787e482e40054ae): Value "write/WriteImpl/initialize_write.out" materialized.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e3f6a: 2016-10-18T21:06:49.322Z: JOB_MESSAGE_BASIC: S05: (4333f43c84da33bd): Executing operation write/WriteImpl/ViewAsSingleton(write|WriteImpl|initialize_write.None)/CreatePCollectionView
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e3f9b: 2016-10-18T21:06:49.371Z: JOB_MESSAGE_DEBUG: (2787e482e4005c04): Value "write/WriteImpl/ViewAsSingleton(write|WriteImpl|initialize_write.None)/CreatePCollectionView.out" materialized.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e4622: 2016-10-18T21:06:51.042Z: JOB_MESSAGE_BASIC: S03: (2de4ec76a3165674): Executing operation group/Close
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e463a: 2016-10-18T21:06:51.066Z: JOB_MESSAGE_BASIC: S06: (82f4625d69dc6479): Executing operation write/WriteImpl/GroupByKey/Create
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e46ef: 2016-10-18T21:06:51.247Z: JOB_MESSAGE_DEBUG: (4333f43c84da3c3f): Value "write/WriteImpl/GroupByKey/Session" materialized.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e46fc: 2016-10-18T21:06:51.260Z: JOB_MESSAGE_BASIC: S07: (82f4625d69dc6f73): Executing operation group/Read+group/GroupByWindow+count+format+write/WriteImpl/write_bundles+write/WriteImpl/Map(<lambda at iobase.py:755>)+write/WriteImpl/WindowInto+write/WriteImpl/GroupByKey/Reify+write/WriteImpl/GroupByKey/Write
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e48a0: 2016-10-18T21:06:51.680Z: JOB_MESSAGE_ERROR: (3140c949fe86037b): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

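The worker error above is a plain tuple-unpacking mismatch: _read_side_inputs expects view_options to hold exactly two values (has_default, default), but this build's windowed-side-input changes evidently hand it a different shape. A minimal sketch of the failure mode, with a hypothetical one-element view_options standing in for whatever the new code actually serialized:

    # Hypothetical stand-in; the real view_options comes from the
    # serialized side-input view options, not from this literal.
    view_options = ('no_default_marker',)

    try:
        has_default, default = view_options  # same unpack as executor.py line 446
    except ValueError as e:
        print(e)  # Python 2.7: "need more than 1 value to unpack"
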
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e4972: 2016-10-18T21:06:51.890Z: JOB_MESSAGE_ERROR: (3140c949fe860af4): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e4a16: 2016-10-18T21:06:52.054Z: JOB_MESSAGE_ERROR: (3140c949fe86026d): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e4ae3: 2016-10-18T21:06:52.259Z: JOB_MESSAGE_ERROR: (3140c949fe860ef1): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e4b8e: 2016-10-18T21:06:52.430Z: JOB_MESSAGE_ERROR: (3140c949fe86066a): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack

INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e4bb6: 2016-10-18T21:06:52.470Z: JOB_MESSAGE_DEBUG: (afad86e2b9352547): Executing failure step failure1
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e4bb8: 2016-10-18T21:06:52.472Z: JOB_MESSAGE_ERROR: (afad86e2b9352fa5): Workflow failed. Causes: (82f4625d69dc6061): S07:group/Read+group/GroupByWindow+count+format+write/WriteImpl/write_bundles+write/WriteImpl/Map(<lambda at iobase.py:755>)+write/WriteImpl/WindowInto+write/WriteImpl/GroupByKey/Reify+write/WriteImpl/GroupByKey/Write failed.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e4bec: 2016-10-18T21:06:52.524Z: JOB_MESSAGE_DETAILED: (5ca87e55b6c649af): Cleaning up.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e4c73: 2016-10-18T21:06:52.659Z: JOB_MESSAGE_DEBUG: (5ca87e55b6c645a8): Starting worker pool teardown.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99e4c76: 2016-10-18T21:06:52.662Z: JOB_MESSAGE_BASIC: (5ca87e55b6c6484e): Stopping worker pool...
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99f84fd: 2016-10-18T21:08:12.669Z: JOB_MESSAGE_BASIC: (5ca87e55b6c64447): Worker pool stopped.
INFO:root:2016-10-18_14_04_17-7862377147606629765_00000157d99f88f9: 2016-10-18T21:08:13.689Z: JOB_MESSAGE_DEBUG: (5ca87e55b6c642e6): Tearing down pending resources...
INFO:root:Job 2016-10-18_14_04_17-7862377147606629765 is in state JOB_STATE_FAILED
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "<https://builds.apache.org/job/beam_PostCommit_PythonVerify/ws/sdks/python/apache_beam/examples/wordcount.py",> line 107, in <module>
    run()
  File "<https://builds.apache.org/job/beam_PostCommit_PythonVerify/ws/sdks/python/apache_beam/examples/wordcount.py",> line 98, in run
    result = p.run()
  File "apache_beam/pipeline.py", line 159, in run
    return self.runner.run(self)
  File "apache_beam/runners/dataflow_runner.py", line 188, in run
    % getattr(self, 'last_error_msg', None), self.result)
apache_beam.runners.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed:
(3140c949fe86066a): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 474, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 909, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:24416)
    op.start()
  File "dataflow_worker/executor.py", line 473, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14278)
    def start(self):
  File "dataflow_worker/executor.py", line 500, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14093)
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 84, in apache_beam.runners.common.DoFnRunner.__init__ (apache_beam/runners/common.c:3292)
    for side_input in side_inputs]
  File "dataflow_worker/executor.py", line 446, in _read_side_inputs (dataflow_worker/executor.c:12804)
    has_default, default = view_options
ValueError: need more than 1 value to unpack


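On the driver side the same failure surfaces as DataflowRuntimeException, raised from dataflow_runner.py line 188 with the last worker error attached. A hedged sketch of trapping it when invoking the pipeline programmatically; run_checked is a hypothetical helper, and p stands for a pipeline built the way wordcount.py builds one:

    import logging

    from apache_beam.runners.dataflow_runner import DataflowRuntimeException

    def run_checked(p):
        try:
            return p.run()
        except DataflowRuntimeException as exc:
            # Per the raise site above, the exception message carries
            # last_error_msg from the failed job.
            logging.error('Dataflow pipeline failed: %s', exc)
            raise
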
# Grep will exit with status 1 if success message was not found.
echo ">>> CHECKING JOB SUCCESS"
>>> CHECKING JOB SUCCESS
grep JOB_STATE_DONE job_output
Build step 'Execute shell' marked build as failure
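
For reference, the success check that failed here is just the grep shown above: the captured console log (job_output) must contain the terminal state JOB_STATE_DONE, and grep's nonzero exit status is what marks the build failed. An equivalent sketch in Python, assuming the same job_output file:

    import sys

    # Mirror `grep JOB_STATE_DONE job_output`: exit nonzero unless the
    # success marker appears, failing the CI step exactly as grep does.
    with open('job_output') as f:
        if not any('JOB_STATE_DONE' in line for line in f):
            sys.exit(1)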