Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/01/09 21:45:43 UTC

Build failed in Jenkins: beam_PostCommit_Python2 #1409

See <https://builds.apache.org/job/beam_PostCommit_Python2/1409/display/redirect?page=changes>

Changes:

[bhulette] [BEAM-9075] add a test case. (#10545)


------------------------------------------
[...truncated 6.83 MB...]
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:25:36.567Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
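
The traceback above captures the whole failure mechanism: the Dataflow worker unpickles the serialized DoFn with dill, unpickling re-imports the module that defines it (dataflow_exercise_metrics_pipeline), and that module's top-level hamcrest import hits a PyHamcrest release whose source no longer parses under Python 2.7, so the work item dies with a SyntaxError before any user code runs. Below is a minimal sketch of the same effect, with hypothetical module names (broken_dep, job_module) in place of anything from this build:

    # broken_dep.py -- stands in for a dependency that ships Python 3-only
    # syntax; under Python 2 the parser raises SyntaxError at import time.

    # job_module.py -- stands in for dataflow_exercise_metrics_pipeline:
    import broken_dep          # module-level import, runs on every import

    class MyDoFn(object):
        def process(self, element):
            return [element]

    # submit side -- dill pickles the instance by *reference*
    # (job_module.MyDoFn); the payload carries names, not code:
    import dill
    import job_module
    payload = dill.dumps(job_module.MyDoFn())

    # worker side -- dill.loads must import job_module to rebuild the
    # instance, which re-executes `import broken_dep` and fails with the
    # SyntaxError above, long after the job itself was submitted cleanly.
    obj = dill.loads(payload)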

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:25:39.696Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  [... identical traceback to the one above, ending in the same hamcrest SyntaxError ...]

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:25:42.816Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  [... identical traceback ...]

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:25:45.941Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  [... identical traceback ...]

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:25:45.975Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:25:46.061Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:25:46.096Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-010921203-01091320-rcqz-harness-0qjz
      Root cause: Work item failed.,
  beamapp-jenkins-010921203-01091320-rcqz-harness-0qjz
      Root cause: Work item failed.,
  beamapp-jenkins-010921203-01091320-rcqz-harness-0qjz
      Root cause: Work item failed.,
  beamapp-jenkins-010921203-01091320-rcqz-harness-0qjz
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:25:46.223Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:25:46.291Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:25:46.326Z: JOB_MESSAGE_BASIC: Stopping worker pool...
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:27:23.885Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:27:23.922Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T21:27:23.949Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-09_13_20_56-2983990982448959685 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3708.465s

FAILED (SKIP=7, errors=7)
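
The seven errors above all reduce to the same hamcrest import failure. One way to keep a Python 2 test environment from picking up a Python 3-only PyHamcrest release is a version pin with an environment marker; the fragment below is an illustrative sketch only (the bounds are examples, not the pin the Beam project actually shipped):

    # setup.py fragment (illustrative; version bounds are examples):
    REQUIRED_TEST_PACKAGES = [
        # Newer PyHamcrest releases use Python 3-only syntax and fail to
        # parse on 2.7 workers, so cap the version where 2.7 still works.
        'pyhamcrest>=1.9,<1.10; python_version < "3.0"',
        'pyhamcrest>=1.10; python_version >= "3.0"',
    ]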

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 49s
121 actionable tasks: 95 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/vkrrledp7mtnq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python2 #1424

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1424/display/redirect?page=changes>



Build failed in Jenkins: beam_PostCommit_Python2 #1423

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1423/display/redirect?page=changes>

Changes:

[ehudm] junitxml_report: Add failure tag support

[jeff] BEAM-8745 More fine-grained controls for the size of a BigQuery Load job

[apilloud] [BEAM-9027] Unparse DOY/DOW/WEEK Enums properly for ZetaSQL

[github] [BEAM-8490] Fix instance_to_type for empty containers (#9894)


------------------------------------------
[...truncated 6.95 MB...]
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:25:01.424Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  [... identical traceback to the one in build #1409 above, ending in the same hamcrest SyntaxError ...]

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:25:04.583Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  [... identical traceback ...]

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:25:07.748Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  [... identical traceback ...]

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:25:07.872Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:25:08.020Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:25:08.087Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-011100194-01101620-813e-harness-2bh5
      Root cause: Work item failed.,
  beamapp-jenkins-011100194-01101620-813e-harness-2bh5
      Root cause: Work item failed.,
  beamapp-jenkins-011100194-01101620-813e-harness-2bh5
      Root cause: Work item failed.,
  beamapp-jenkins-011100194-01101620-813e-harness-2bh5
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:25:08.324Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:25:08.491Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:25:08.522Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:26:49.953Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:26:50.050Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-11T00:26:50.118Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-10_16_20_02-6135687770552597234 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3455.613s

FAILED (SKIP=7, errors=7)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_42_52-7996123708400212316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_49_52-14927757147151197217?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_58_45-12994962277787255421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_05_30-15607548435297420783?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_11_58-1603125168541972973?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_19_00-9609581316064722828?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_26_45-10168860200952177908?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_33_29-8943150910558732124?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_42_57-12419078398436456145?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_56_55-4912920566430986954?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_04_26-2080872406019290789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_11_44-14696864795545843271?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_19_17-666630178353633373?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_42_52-7588141246856041612?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_03_26-215486481500421115?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_42_56-10677809443310524032?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_55_11-5843543609025775610?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_02_10-3468436355068086222?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_09_00-3557709778630744238?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_15_34-4380262633604987825?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_42_53-10888295342349118025?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_58_39-2289351011773536057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_05_47-1978889409738650380?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_12_40-16982970689008856881?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_20_02-6135687770552597234?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_42_53-8326316529949748025?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_50_14-9099485866179913215?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_57_58-17007652788396891909?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_04_46-4578044080326633530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_11_06-14035430637231861664?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_18_11-18210394755706113267?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_42_54-5858539314985851359?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_50_13-16498821665249586136?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_57_49-7604688517986727697?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_04_51-17227273192433724371?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_11_50-8086694540876906777?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_18_54-7475398734491038599?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_42_53-15740141439115328914?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_51_22-2070941040007910048?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_01_01-550384172377216989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_09_06-14251051392958236248?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_16_16_30-8563643272044605420?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 56s
121 actionable tasks: 98 executed, 20 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/o7jmppfnxhy46

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python2 #1422

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1422/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-8575] Added counter tests for CombineFn (#10190)


------------------------------------------
[...truncated 6.30 MB...]
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:23:37.580Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  [... identical traceback to the one in build #1409 above, ending in the same hamcrest SyntaxError ...]

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:23:40.758Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  [... identical traceback ...]

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:23:41.890Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  [... identical traceback ...]

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:23:41.921Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:23:42.009Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:23:42.043Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-011023184-01101518-3xb3-harness-tb4m
      Root cause: Work item failed.,
  beamapp-jenkins-011023184-01101518-3xb3-harness-tb4m
      Root cause: Work item failed.,
  beamapp-jenkins-011023184-01101518-3xb3-harness-tb4m
      Root cause: Work item failed.,
  beamapp-jenkins-011023184-01101518-3xb3-harness-tb4m
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:23:42.172Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:23:42.263Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:23:42.301Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:25:15.882Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:25:15.922Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T23:25:15.961Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-10_15_18_56-16925227176061397588 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3492.088s

FAILED (SKIP=7, errors=7)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_42_50-8975421320492404950?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_59_33-3555055671297971733?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_06_29-848094499877655307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_12_52-11595501222043089628?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_19_58-13695310994010326425?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_27_03-12698501470115235731?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_33_38-13713574286114466217?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_42_54-3501194884615714769?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_57_05-9584503026026753466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_04_42-7401236939430904488?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_12_06-10069421849780681179?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_20_22-15352731780960153908?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_42_50-13039112317283018427?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_02_55-16624173432048352152?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_42_51-8659600673187118578?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_55_19-14556384025661831872?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_01_35-5150140669259893482?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_08_09-13673398658625077829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_15_00-10629389194145863475?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_42_50-11525270367234772160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_50_31-613627293137494943?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_58_09-5905864814051436266?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_05_10-11544844613881843624?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_12_07-1732126021027763184?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_20_01-7266361435738237228?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_42_49-10342306155943813753?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_49_36-1111020399590071266?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_58_17-14827844823302193022?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_05_27-9801671757316885860?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_12_07-11868748208550691679?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_18_56-16925227176061397588?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_42_50-12413673434092413477?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_50_41-17036061646223826541?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_57_52-15110170661657010890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_04_31-7821426894764825368?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_11_22-15955976065321139531?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_17_43-14180952416648169270?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_42_50-10997618510135960835?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_14_50_59-14685699240961384980?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_01_11-379703477654685094?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_08_45-14260241251460550281?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_15_15_43-4247937453946387298?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 23s
121 actionable tasks: 97 executed, 21 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/d27ijr2jsu5j4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
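
To dig into a failure like this locally, the same Gradle task can be invoked from the root of a Beam checkout; a sketch, assuming the standard Gradle wrapper and GCP test credentials are already set up:

    ./gradlew :sdks:python:test-suites:dataflow:py2:postCommitIT --info

The deprecation warnings summarized above can likewise be expanded with the flag the log itself suggests:

    ./gradlew :sdks:python:test-suites:dataflow:py2:postCommitIT --warning-mode all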

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1421

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1421/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8337] Hard-code Flink versions.

[kcweaver] fix indentation

[kcweaver] Update release guide

[lostluck] [BEAM-9080] Support KVs in the Go SDK's Partition

[github] Rephrasing lull logging to avoid alarming users (#10446)


------------------------------------------
[...truncated 2.25 MB...]
	status = StatusCode.CANCELLED
	details = "Multiplexer hanging up"
	debug_error_string = "{"created":"@1578692896.317617785","description":"Error received from peer ipv4:127.0.0.1:41943","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Multiplexer hanging up","grpc_status":1}"
>
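
The CANCELLED status above surfaces on the Python side as a grpc.RpcError. A minimal sketch of how such a status is typically inspected, where do_rpc is a hypothetical zero-argument callable wrapping a single gRPC call (not Beam code):

    import grpc

    def call_with_cancel_check(do_rpc):
        # do_rpc: hypothetical callable performing a single gRPC call.
        try:
            return do_rpc()
        except grpc.RpcError as err:
            if err.code() == grpc.StatusCode.CANCELLED:
                # "Multiplexer hanging up" commonly appears when the SDK harness
                # closes its control/data channels during shutdown.
                return None
            raise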

[MapPartition (MapPartition at [1]read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at [1]read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)) (9/16) (9614d586d25d1abffed883c00e4b94b4) switched from RUNNING to FINISHED.
[MapPartition (MapPartition at [1]read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at [1]read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)) (9/16) (9614d586d25d1abffed883c00e4b94b4).
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb) switched from CREATED to SCHEDULED.
[MapPartition (MapPartition at [1]read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at [1]read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)) (9/16) (9614d586d25d1abffed883c00e4b94b4) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task MapPartition (MapPartition at [1]read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)) 9614d586d25d1abffed883c00e4b94b4.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSink (DiscardingOutput) (9/16) (attempt #0) to 44764a56-50cc-4770-83fd-c5927665fa24 @ localhost (dataPort=-1)
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at [1]read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)) (9/16) (9614d586d25d1abffed883c00e4b94b4) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSink (DiscardingOutput) (9/16).
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb) switched from CREATED to DEPLOYING.
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb) [DEPLOYING]
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb) [DEPLOYING].
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb) [DEPLOYING].
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb) switched from DEPLOYING to RUNNING.
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb) switched from RUNNING to FINISHED.
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb).
[DataSink (DiscardingOutput) (9/16)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSink (DiscardingOutput) 5ab6d26aaa6a34aa2fcb23f941484feb.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (DiscardingOutput) (9/16) (5ab6d26aaa6a34aa2fcb23f941484feb) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-jenkins-0110214746-6e62f396 (faba50cff6c7073255e3358942cdfe96) switched from state RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job faba50cff6c7073255e3358942cdfe96 reached globally terminal state FINISHED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-jenkins-0110214746-6e62f396(faba50cff6c7073255e3358942cdfe96).
[flink-runner-job-invoker] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-invoker] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:6, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: 4eb515d336279a0957154c34e6a8b2bb, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection 4a0fe245d3379868ad967716d5d4aa93: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPoolImpl - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager 83e2ccc753154dcfedb56a88041c42f1@akka://flink/user/jobmanager_1 for job faba50cff6c7073255e3358942cdfe96 from the resource manager.
[mini-cluster-io-thread-13] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job faba50cff6c7073255e3358942cdfe96 with leader id 83e2ccc753154dcfedb56a88041c42f1 lost leadership.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:5, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: b89ac52b41fd4c4420438808a7574bb5, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:2, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: ce0fc36178c379a2c96c414eff5ab073, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:9, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: 107ecd61155d103dcf75706b49c1407b, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:12, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: 3299a607f749dd2f403c4d546d889c29, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:3, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: bab783aaab169a9cafb93c8fb430fe70, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:11, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: 1d4d22e246858a99e1388f3367defaaf, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:13, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: 1bbdf164fde85a80bd1342b2daa5a88f, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:8, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: a1716977a4d7aa309010c717df50c416, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:14, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: 60d907da35653dfd4799f806ef766b63, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:0, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: dafc36a6695d07047a0721a8e75ab85c, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:10, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: 3fd25d50b7ea7a947993dedc73ee73c0, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:7, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: b7e37f3f5a4b070450c7e6007ffb9ac1, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:1, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: d7c8a09640a7b56b6463bd15d9de9e6e, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:4, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: 630c2ecc36ff65aad6736d9648d5f867, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Free slot TaskSlot(index:15, state:ACTIVE, resource profile: ResourceProfile{cpuCores=1.7976931348623157E308, heapMemoryInMB=2147483647, directMemoryInMB=2147483647, nativeMemoryInMB=2147483647, networkMemoryInMB=2147483647, managedMemoryInMB=1017}, allocationId: 3bc99f6127601aa44d443e3d0cf5932e, jobId: faba50cff6c7073255e3358942cdfe96).
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Remove job faba50cff6c7073255e3358942cdfe96 from job leader monitoring.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job faba50cff6c7073255e3358942cdfe96.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close JobManager connection for job faba50cff6c7073255e3358942cdfe96.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Cannot reconnect to job faba50cff6c7073255e3358942cdfe96 because it is not registered.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Close ResourceManager connection 4a0fe245d3379868ad967716d5d4aa93.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Closing TaskExecutor connection 44764a56-50cc-4770-83fd-c5927665fa24 because: The TaskExecutor is shutting down.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-io-d9cef9c4-6566-4792-8720-8bf8b1f62f14
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.io.network.NettyShuffleEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManagerImpl - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.io.disk.FileChannelManagerImpl - FileChannelManager removed spill file directory /tmp/flink-netty-shuffle-df0cba04-e0d8-457f-8844-5095db59547f
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.KvStateService - Shutting down the kvState service and its components.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-493bc8a0-dd6e-4dcd-b435-8379721c9dbd
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-7] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:42009
[flink-akka.actor.default-dispatcher-8] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Execution finished in 24351 msecs
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - Final accumulator values:
[flink-runner-job-invoker] INFO org.apache.beam.runners.flink.FlinkPipelineRunner - __metricscontainers : MetricQueryResults(Counters(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_27}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_26}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_27}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_27}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:1:0}: 0, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_27}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_25}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_23}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_5:0}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_5:0}: 0, 
43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_5:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16:0}: 4, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: 1, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15}: 0, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2595>)_19}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2595>)_12}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_23}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: 1, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_32}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:1:0}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2595>)_12}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_21}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2595>)_19}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_19:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2595>)_19}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_21}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2595>)_12}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_5:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_33}: 0, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_21}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_25}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/Map(decode)_21}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_25}: 0, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_7:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_32}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_32}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Unkey_33}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_1:0}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 
42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_6}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: 4, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_0_25}: 0, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 1, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_7:0}: 0, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: 1, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_5}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: 1, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/read/ref_PCollection_PCollection_9:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: 1, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_16:1}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_1/SplitAndSize0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 
{PCOLLECTION=ref_PCollection_PCollection_13}: 4, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2595>)_12}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2595>)_19}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: 4, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_26}: 0, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_27}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:1:0}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_27}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9}: 0, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_27}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_34}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 
{PTRANSFORM=ref_AppliedPTransform_assert_that/Match_34}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_4:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/ProcessSizedElementsAndRestrictions0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_32}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_34}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: 4, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_26}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: 0, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_7:0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: 1, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:element_count:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: 4, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:start_bundle_msecs:v1 {PTRANSFORM=fn/write/ref_PCollection_PCollection_16:0:0}: 0, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Match_34}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_23}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:ptransform_execution_time:total_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_22}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:process_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/ToVoidKey_23}: 0, 
26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/Flatten_27}: 0, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:pardo_execution_time:finish_bundle_msecs:v1 {PTRANSFORM=ref_AppliedPTransform_assert_that/Group/pair_with_1_26}: 0)Distributions(42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1}: DistributionResult{sum=13, count=1, min=13, max=13}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/PairWithRestriction0}: DistributionResult{sum=754, count=1, min=754, max=754}, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16:1}: DistributionResult{sum=19, count=1, min=19, max=19}, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=15, count=1, min=15, max=15}, 47read/_PassThroughThenCleanup/Create/Map(decode).None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_8}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=1341, count=1, min=1341, max=1341}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_19}: DistributionResult{sum=145, count=1, min=145, max=145}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/SplitAndSize0, TAG=None}: DistributionResult{sum=1341, count=1, min=1341, max=1341}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_16:0}: DistributionResult{sum=172, count=4, min=40, max=49}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_12}: DistributionResult{sum=152, count=4, min=35, max=44}, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_5}: DistributionResult{sum=13, count=1, min=13, max=13}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_3}: DistributionResult{sum=152, count=4, min=35, max=44}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_20}: DistributionResult{sum=123, count=1, min=123, max=123}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_2}: DistributionResult{sum=152, count=4, min=35, max=44}, 
42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_21}: DistributionResult{sum=115, count=1, min=115, max=115}, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_7}: DistributionResult{sum=14, count=1, min=14, max=14}, 42assert_that/Group/GroupByKey/GroupByWindow.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_22}: DistributionResult{sum=14, count=1, min=14, max=14}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PTRANSFORM=ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6/PairWithRestriction0, TAG=None}: DistributionResult{sum=754, count=1, min=754, max=754}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_4}: DistributionResult{sum=0, count=0, min=9223372036854775807, max=-9223372036854775808}, 43read/_PassThroughThenCleanup/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_6}: DistributionResult{sum=14, count=1, min=14, max=14}, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_14}: DistributionResult{sum=17, count=1, min=17, max=17}, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_11}: DistributionResult{sum=15, count=1, min=15, max=15}, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_10}: DistributionResult{sum=15, count=1, min=15, max=15}, 26assert_that/Create/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_9}: DistributionResult{sum=13, count=1, min=13, max=13}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_15}: DistributionResult{sum=164, count=4, min=38, max=47}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/SplitAndSize0/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_13}: DistributionResult{sum=156, count=4, min=36, max=45}, 42read/Read/_SDFBoundedSourceWrapper/Impulse.None/beam:env:external:v1:0:beam:metric:sampled_byte_size:v1 {PCOLLECTION=ref_PCollection_PCollection_1/SplitAndSize0}: DistributionResult{sum=1341, count=1, min=1341, max=1341}))
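
The accumulator dump above is the flattened text form of Beam's metrics. Programmatically, the same counters and distributions can be read back from the PipelineResult; a sketch, assuming pipeline is an already-constructed apache_beam.Pipeline and that the name filter shown is merely illustrative:

    from apache_beam.metrics.metric import MetricsFilter

    result = pipeline.run()
    result.wait_until_finish()
    query = result.metrics().query(MetricsFilter().with_name('element_count'))
    for counter in query['counters']:
        print('%s: %s' % (counter.key, counter.committed))
    for dist in query['distributions']:
        # Each DistributionResult carries sum/count/min/max, as printed above.
        print('%s: %s' % (dist.key, dist.committed))
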
[flink-runner-job-invoker] WARN org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService - Failed to remove job staging directory for token {"sessionId":"job_90ec1e5e-19a0-4e00-8326-df7c27552f0a","basePath":"/tmp/beam-templia8K0/artifactsYySAXn"}: {}
java.io.FileNotFoundException: /tmp/beam-templia8K0/artifactsYySAXn/job_90ec1e5e-19a0-4e00-8326-df7c27552f0a/MANIFEST (No such file or directory)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.<init>(FileInputStream.java:138)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:118)
	at org.apache.beam.sdk.io.LocalFileSystem.open(LocalFileSystem.java:82)
	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:252)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService.loadManifest(BeamFileSystemArtifactRetrievalService.java:88)
	at org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService.removeArtifacts(BeamFileSystemArtifactStagingService.java:92)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver.lambda$createJobService$0(JobServerDriver.java:63)
	at org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService.lambda$run$0(InMemoryJobService.java:201)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.setState(JobInvocation.java:241)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.access$200(JobInvocation.java:48)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:110)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation$1.onSuccess(JobInvocation.java:96)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.Futures$CallbackListener.run(Futures.java:1058)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... SKIP: This test doesn't work on these runners: ['PortableRunner', 'FlinkRunner']

----------------------------------------------------------------------
XML: nosetests-postCommitIT-flink-py2.xml
----------------------------------------------------------------------
XML: /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python2/src/sdks/python/nosetests.xml
----------------------------------------------------------------------
Ran 4 tests in 72.521s

OK (SKIP=2)
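The FileNotFoundException logged above is a cleanup-time warning rather than a test failure: the Flink suite still finishes OK. By the time InMemoryJobService tried to remove the job staging directory, the MANIFEST file under /tmp was already gone, so loadManifest failed. A tolerant cleanup would treat a missing manifest as work already done; a rough sketch of that idea in Python (the real service is Java, and the names below are illustrative, not Beam's code):

    # Illustrative only: tolerate a manifest that is already gone during
    # artifact cleanup instead of surfacing FileNotFoundException.
    import errno
    import os
    import shutil

    def remove_artifacts(staging_dir):
        manifest = os.path.join(staging_dir, 'MANIFEST')
        try:
            open(manifest).close()   # probe the manifest as loadManifest would
        except IOError as e:
            if e.errno == errno.ENOENT:
                return               # another cleanup already removed it
            raise
        shutil.rmtree(staging_dir, ignore_errors=True)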

> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/genproto: commit='2b5a72b8730b0b16380010cfe5286c42108d88e7', urls=[https://github.com/google/go-genproto]
Resolving google.golang.org/grpc: commit='7646b5360d049a7ca31e9133315db43456f39e2e', urls=[https://github.com/grpc/grpc-go]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]
Resolving cached github.com/coreos/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/coreos/etcd.git, git@github.com:coreos/etcd.git]

> Task :sdks:python:test-suites:direct:py2:directRunnerIT
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok

> Task :sdks:go:installDependencies

> Task :sdks:python:test-suites:direct:py2:directRunnerIT
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

> Task :sdks:go:buildLinuxAmd64
> Task :sdks:go:goBuild

> Task :sdks:java:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python2/src/sdks/go

> Task :sdks:python:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python2/src/sdks/go

> Task :sdks:java:container:installDependencies
> Task :sdks:python:container:installDependencies
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild
> Task :sdks:java:container:dockerPrepare

> Task :sdks:python:test-suites:direct:py2:directRunnerIT
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok

----------------------------------------------------------------------
XML: nosetests-directRunnerIT-streaming.xml
----------------------------------------------------------------------
XML: /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python2/src/sdks/python/nosetests.xml
----------------------------------------------------------------------
Ran 7 tests in 44.353s

OK

> Task :sdks:java:container:docker
Sending build context to Docker daemon  101.4MB
Step 1/9 : FROM openjdk:8
 ---> f8146facf376
Step 2/9 : MAINTAINER "Apache Beam <de...@beam.apache.org>"

> Task :sdks:python:container:buildDarwinAmd64
FATAL: command execution failed
java.io.IOException: Backing channel 'JNLP4-connect connection from 50.182.239.35.bc.googleusercontent.com/35.239.182.50:59454' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:214)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:283)
	at com.sun.proxy.$Proxy141.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1150)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1142)
	at hudson.Launcher$ProcStarter.join(Launcher.java:470)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:741)
	at hudson.model.Build$BuildExecution.build(Build.java:206)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
	at hudson.model.Run.execute(Run.java:1815)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: java.nio.channels.ClosedChannelException
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer.onReadClosed(ChannelApplicationLayer.java:209)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.onRecvClosed(ApplicationLayer.java:222)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
	at org.jenkinsci.remoting.protocol.FilterLayer.onRecvClosed(FilterLayer.java:287)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.onRecvClosed(SSLEngineFilterLayer.java:181)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.switchToNoSecure(SSLEngineFilterLayer.java:283)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processWrite(SSLEngineFilterLayer.java:503)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processQueuedWrites(SSLEngineFilterLayer.java:248)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doSend(SSLEngineFilterLayer.java:200)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doCloseSend(SSLEngineFilterLayer.java:213)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.doCloseSend(ProtocolStack.java:784)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.doCloseWrite(ApplicationLayer.java:173)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer$ByteBufferCommandTransport.closeWrite(ChannelApplicationLayer.java:314)
	at hudson.remoting.Channel.close(Channel.java:1452)
	at hudson.remoting.Channel.close(Channel.java:1405)
	at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:847)
	at hudson.slaves.SlaveComputer.kill(SlaveComputer.java:814)
	at hudson.model.AbstractCIBase.killComputer(AbstractCIBase.java:89)
	at jenkins.model.Jenkins.access$2100(Jenkins.java:312)
	at jenkins.model.Jenkins$19.run(Jenkins.java:3464)
	at hudson.model.Queue._withLock(Queue.java:1379)
	at hudson.model.Queue.withLock(Queue.java:1256)
	at jenkins.model.Jenkins._cleanUpDisconnectComputers(Jenkins.java:3458)
	at jenkins.model.Jenkins.cleanUp(Jenkins.java:3336)
	at hudson.WebAppMain.contextDestroyed(WebAppMain.java:379)
	at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4732)
	at org.apache.catalina.core.StandardContext.stopInternal(StandardContext.java:5396)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1400)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1389)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
	at org.apache.catalina.core.ContainerBase.stopInternal(ContainerBase.java:976)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1400)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1389)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
	at org.apache.catalina.core.ContainerBase.stopInternal(ContainerBase.java:976)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.StandardService.stopInternal(StandardService.java:473)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.StandardServer.stopInternal(StandardServer.java:994)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.startup.Catalina.stop(Catalina.java:706)
	at org.apache.catalina.startup.Catalina.start(Catalina.java:668)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:344)
	at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:475)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-11 is offline; cannot locate JDK 1.8 (latest)
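This particular failure is infrastructure rather than Beam: the trace shows jenkins.model.Jenkins.cleanUp and WebAppMain.contextDestroyed, i.e. the Jenkins controller was shutting down, which closed the remoting channel to apache-beam-jenkins-11 mid-build and left the node offline. Runs with this signature can be separated from genuine test failures mechanically; a small, purely illustrative triage helper built from strings visible in this log:

    # Illustrative log triage: flag a build log as an infrastructure flake
    # when it matches the agent-disconnect signatures seen above.
    import re

    INFRA_SIGNATURES = [
        r"Backing channel .* is disconnected",
        r"java\.nio\.channels\.ClosedChannelException",
        r"is offline; cannot locate JDK",
    ]

    def is_infra_failure(log_text):
        return any(re.search(p, log_text) for p in INFRA_SIGNATURES)

    # e.g. is_infra_failure(open('build.log').read()) would be True for this run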

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1420

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1420/display/redirect?page=changes>

Changes:

[valentyn] Install SDK after tarball is generated to avoid a race in proto stubs

[kcweaver] [BEAM-9070] tests use absolute paths for job server jars

[github] [BEAM-5605] Add support for executing pair with restriction, split


------------------------------------------
[...truncated 6.93 MB...]
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
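The recurring error in this run (and in #1419 and #1418 below) is the same import-time failure: unpickling the serialized DoFn on the Dataflow worker re-imports dataflow_exercise_metrics_pipeline, which imports hamcrest, and the installed PyHamcrest release evidently uses syntax that Python 2.7 cannot parse (hasproperty.py line 174). Since the worker installs the latest matching release, the usual mitigation is to pin the test dependency to a release that still supports Python 2. A sketch of such a pin (the list name and the "<1.10" bound are assumptions for illustration, not Beam's actual setup.py):

    # Hypothetical pin in the SDK's test requirements; '<1.10' assumes the
    # Python-3-only syntax first shipped in PyHamcrest 1.10.
    REQUIRED_TEST_PACKAGES = [
        'pyhamcrest>=1.9,<1.10',
    ]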

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:37:39.590Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:37:42.734Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:37:45.981Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:37:46.010Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:37:46.133Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:37:46.163Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-011020324-01101232-w2j4-harness-1p0p
      Root cause: Work item failed.,
  beamapp-jenkins-011020324-01101232-w2j4-harness-1p0p
      Root cause: Work item failed.,
  beamapp-jenkins-011020324-01101232-w2j4-harness-1p0p
      Root cause: Work item failed.,
  beamapp-jenkins-011020324-01101232-w2j4-harness-1p0p
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:37:46.274Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:37:46.344Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:37:46.370Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:39:42.136Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:39:42.203Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T20:39:42.260Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-10_12_32_58-11475409924429724657 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3452.318s

FAILED (SKIP=7, errors=7)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_11_56_30-5580659746245792154?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_12_23-4670007265735559019?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_19_29-14088794296853644526?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_26_08-12687394538895651300?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_33_04-11969020753189237686?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_39_58-4059714721574216400?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_46_43-8279300571533789358?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_11_56_35-6673114304208438719?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_10_54-16355773949081700131?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_18_43-17681080936713579785?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_25_59-11111444092541970949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_11_56_29-13406986387050737107?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_16_00-10635355462123892290?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_11_56_32-1899772072364888128?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_08_32-15779922812970447218?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_14_58-12209532464877393542?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_21_25-1292241430958345319?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_27_46-8800927080809387216?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_33_58-10338493925236199702?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_11_56_33-4708028985511787929?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_04_16-6591757939072853703?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_11_49-13756005780156374239?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_18_19-7387765138657579508?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_25_05-14261348168279377236?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_31_48-6906597372929998491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_11_56_28-16966273933172198006?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_03_09-11508310357416135585?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_12_10-6616156367330492628?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_18_54-1908577128101514090?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_25_51-15916855316119046840?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_32_46-6841061555840722464?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_11_56_31-1744896207332326580?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_04_14-5782039091379167924?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_12_09-10777259703222007759?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_19_26-3886443990022860219?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_26_11-3587104637884857304?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_32_58-11475409924429724657?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_11_56_30-1353016261964191222?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_05_47-2161315470871884333?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_15_40-10269612003889890658?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_24_03-10698133307395498776?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_12_31_02-15588838281862216238?project=apache-beam-testing
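All 7 errors in the summary above are instances of the same import failure, and each job's four attempts landed on a single harness VM, which points at a deterministic worker-startup problem rather than a per-machine flake. The failure reproduces without Dataflow by importing the module under the worker's Python 2.7 environment; a minimal, purely illustrative pre-flight check:

    # Run under the same interpreter/site-packages as the worker (Python 2.7):
    import importlib

    for name in ('hamcrest', 'apache_beam'):
        try:
            importlib.import_module(name)
            print('%s imports cleanly' % name)
        except SyntaxError as err:
            print('%s ships py3-only syntax: %s' % (name, err))
        except ImportError as err:
            print('%s is not installed: %s' % (name, err))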

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 24s
121 actionable tasks: 101 executed, 17 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/fdvk2opm4g2ju

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1419

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1419/display/redirect?page=changes>

Changes:

[lukecwik] [BEAM-8624] Implement Worker Status FnService in Dataflow runner


------------------------------------------
[...truncated 6.81 MB...]
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:35:45.353Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:35:48.563Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:35:51.701Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:35:53.926Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:35:53.951Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:35:54.027Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:35:54.065Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-011019305-01101131-x194-harness-2dhj
      Root cause: Work item failed.,
  beamapp-jenkins-011019305-01101131-x194-harness-2dhj
      Root cause: Work item failed.,
  beamapp-jenkins-011019305-01101131-x194-harness-2dhj
      Root cause: Work item failed.,
  beamapp-jenkins-011019305-01101131-x194-harness-2dhj
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:35:54.172Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:35:54.233Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:35:54.271Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:37:48.091Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:37:48.179Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T19:37:48.231Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-10_11_31_17-16250853040119981695 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3569.338s

FAILED (SKIP=7, errors=7)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 40s
121 actionable tasks: 98 executed, 20 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/bgmhjefh3pavg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1418

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1418/display/redirect?page=changes>

Changes:

[suztomo] google_auth_version 0.19.0


------------------------------------------
[...truncated 6.85 MB...]
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:18:41.153Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:18:44.297Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:18:47.467Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
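
Why does a test matcher break a Dataflow work item at all? The stack shows the path: DoOperation.setup unpickles the serialized DoFn (pickler.loads, then dill.loads), and dill restores module-level functions by re-importing their defining module on the worker. dataflow_exercise_metrics_pipeline imports hamcrest at module scope, so the unpickle step inherits the import failure before a single element is processed. A self-contained sketch of that mechanism (not Beam's actual code path):

    # Sketch: dill pickles a function that lives in an importable module by
    # reference (module name + attribute name), so dill.loads must re-import
    # that module, running all of its top-level imports.
    import dill
    from os import path

    payload = dill.dumps(path.join)  # stored by reference (posixpath.join on Linux)

    # On a Dataflow worker, this is the step that calls dill's _import_module;
    # if the referenced module (or anything it imports, e.g. hamcrest) fails
    # at import time, unpickling fails exactly as in the traceback above.
    restored = dill.loads(payload)
    assert restored('a', 'b') == path.join('a', 'b')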

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:18:50.604Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:18:50.632Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:18:50.715Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:18:50.751Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-011018140-01101014-iagy-harness-wr14
      Root cause: Work item failed.,
  beamapp-jenkins-011018140-01101014-iagy-harness-wr14
      Root cause: Work item failed.,
  beamapp-jenkins-011018140-01101014-iagy-harness-wr14
      Root cause: Work item failed.,
  beamapp-jenkins-011018140-01101014-iagy-harness-wr14
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:18:50.877Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:18:50.934Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:18:50.969Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:20:14.433Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:20:14.547Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T18:20:14.596Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-10_10_14_17-10068311502763708081 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3535.982s

FAILED (SKIP=7, errors=7)
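
All seven errors trace back to the same incompatible PyHamcrest release landing on the Python 2.7 workers, so the usual mitigation is to constrain the dependency until a Python 2-compatible release is required again. A hedged sketch of such a pin for a hypothetical pipeline package (this is not necessarily the fix Beam shipped, and the excluded version is an assumption):

    # setup.py sketch; package name and version bounds are assumptions.
    from setuptools import setup

    setup(
        name='example-metrics-pipeline',  # hypothetical package
        version='0.0.1',
        install_requires=[
            # Skip the release believed to have dropped Python 2 support
            # and stay below the 2.x line, which is Python 3 only.
            'pyhamcrest>=1.9,!=1.10.0,<2.0',
        ],
    )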

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 1s
121 actionable tasks: 112 executed, 6 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/3l67r7cqvow3o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1417

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1417/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-9019] Remove BeamCoderWrapper to avoid extra object allocation and


------------------------------------------
[...truncated 6.89 MB...]
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:46:31.102Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
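
Build #1417 fails identically, which suggests the first check before re-triggering: confirm which PyHamcrest release the failing environment actually resolved. A small diagnostic sketch, assuming it can be run under the same Python 2.7 environment as the workers (for example inside the worker container image):

    # Diagnostic sketch: report the installed PyHamcrest version and whether
    # it imports cleanly under the current interpreter.
    import pkg_resources

    print(pkg_resources.get_distribution('PyHamcrest').version)
    try:
        import hamcrest  # noqa: F401
        print('hamcrest imports cleanly')
    except SyntaxError as exc:
        print('hamcrest fails at import time: %s' % exc)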

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:46:32.236Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:46:34.358Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:46:35.474Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:46:35.510Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:46:35.594Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:46:35.629Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-011014414-01100642-uzfs-harness-q5v7
      Root cause: Work item failed.,
  beamapp-jenkins-011014414-01100642-uzfs-harness-q5v7
      Root cause: Work item failed.,
  beamapp-jenkins-011014414-01100642-uzfs-harness-q5v7
      Root cause: Work item failed.,
  beamapp-jenkins-011014414-01100642-uzfs-harness-q5v7
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:46:35.759Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:46:35.830Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:46:35.858Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:48:06.164Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:48:06.211Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T14:48:06.254Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-10_06_42_01-18010762563135437123 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3385.301s

FAILED (SKIP=7, errors=7)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 28s
121 actionable tasks: 96 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/sybqbxs75diyg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1416

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1416/display/redirect>

Changes:


------------------------------------------
[...truncated 6.91 MB...]
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:40:55.422Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax
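
Build #1416 shows the same signature, which also illustrates why module-scope test imports are fragile here: the pipeline module only needs the matcher at assertion time, yet the import cost is paid when the worker unpickles the module. A defensive-import sketch (a stylistic illustration, not the change Beam actually made):

    # Sketch: tolerate a broken matcher library at import time so that
    # unpickling the pipeline module on a worker cannot fail on this line;
    # callers must handle greater_than being None.
    try:
        from hamcrest.library.number.ordering_comparison import greater_than
    except (ImportError, SyntaxError):
        greater_than = None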

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:40:58.566Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:41:01.716Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:41:01.747Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:41:01.830Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:41:01.858Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-011012360-01100436-tivp-harness-2qpn
      Root cause: Work item failed.,
  beamapp-jenkins-011012360-01100436-tivp-harness-2qpn
      Root cause: Work item failed.,
  beamapp-jenkins-011012360-01100436-tivp-harness-2qpn
      Root cause: Work item failed.,
  beamapp-jenkins-011012360-01100436-tivp-harness-2qpn
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:41:01.984Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:41:02.061Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:41:02.094Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:42:30.346Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:42:30.389Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T12:42:30.417Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-10_04_36_17-969509228422670538 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3433.731s

FAILED (SKIP=7, errors=7)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_01_53-17783692402278265938?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_19_08-6473663372273069625?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_26_10-3551311831820891015?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_32_13-10124710168386075645?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_38_49-17451680662806511157?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_45_48-10654806412553335441?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_52_04-10500748046779913867?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_01_59-14191853834660482232?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_16_09-10403229632946006552?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_22_47-1759633981557865117?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_30_06-11685812784688036644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_36_46-12573773268164797256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_01_53-15545905967141240443?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_20_40-3081150689668345871?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_01_57-6924501678964856710?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_14_04-7288691185654582210?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_20_26-15623992634368782042?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_26_33-14659012954357791459?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_32_51-11291482598788018676?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_01_54-1358331894309523652?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_09_35-5237471531779872001?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_16_59-16438059829459018954?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_23_45-4820628629158019156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_30_13-5613820521911756794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_36_17-969509228422670538?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_01_54-2420456110023914917?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_08_39-14709972448967092838?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_17_13-1103738301354188434?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_24_03-12097037561288641371?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_30_10-15823832996871682761?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_37_37-12670389120441337890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_01_53-14366920202238114555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_09_32-18199068723850751672?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_16_50-9288540266960311830?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_23_35-9921949355793364649?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_30_07-12140667674294002987?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_36_15-4369144364995409224?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_01_55-1198957482589125435?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_10_22-6512877117638211582?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_20_08-7169094900146799782?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_28_06-8801621095411590115?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-10_04_35_05-17103865949994755065?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 23s
121 actionable tasks: 95 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/g2msun3x2liss

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1415

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1415/display/redirect>

Changes:


------------------------------------------
[...truncated 6.91 MB...]
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:42:42.981Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:42:46.122Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:42:48.255Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:42:48.290Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:42:48.375Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:42:48.412Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-011006374-01092238-4daw-harness-vfn8
      Root cause: Work item failed.,
  beamapp-jenkins-011006374-01092238-4daw-harness-vfn8
      Root cause: Work item failed.,
  beamapp-jenkins-011006374-01092238-4daw-harness-vfn8
      Root cause: Work item failed.,
  beamapp-jenkins-011006374-01092238-4daw-harness-vfn8
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:42:48.540Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:42:48.625Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:42:48.661Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:43:52.569Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:43:52.609Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T06:43:52.633Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-09_22_38_09-10126808673983103622 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
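The traceback captured above pins down the root cause. The Dataflow worker deserializes the pickled DoFn with dill; unpickling re-imports apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline, whose line 26 imports PyHamcrest; and the hamcrest release installed on the worker contains syntax (hasproperty.py, line 174) that the Python 2.7 interpreter cannot parse, so the import dies with SyntaxError inside DoOperation.setup() before any work item can start. A minimal sketch that reproduces the import failure in isolation (runnable on its own; nothing below is taken from the test's actual source beyond the import shown on line 26 of the traceback):

    # Reproduces the worker-side failure: on Python 2.7, importing hamcrest
    # fails at module load time if the installed PyHamcrest release uses
    # Python 3-only syntax.
    import traceback

    try:
        # Same import as dataflow_exercise_metrics_pipeline.py line 26.
        from hamcrest.library.number.ordering_comparison import greater_than
        print('hamcrest import OK: %r' % greater_than)
    except SyntaxError:
        # On an affected worker this fires inside dill.loads() during
        # DoOperation.setup(), which is why all four attempts fail alike.
        traceback.print_exc()

One plausible mitigation, assuming the culprit is a recent PyHamcrest release that dropped Python 2 support, is to pin the dependency (for example pyhamcrest<1.10) in the environment baked into the worker container; the version bound here is an assumption for illustration, not taken from this log.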

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3305.995s

FAILED (SKIP=7, errors=7)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_02_26-16447322495795703221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_17_58-2030196662050436077?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_24_31-10802972926639764155?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_30_25-9228505611748811221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_37_12-4412301780621648056?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_44_08-2294801690661289491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_50_30-17369414769206143602?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_02_29-2362075390836132725?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_16_35-10563717473485722479?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_23_23-5506112267574758230?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_30_24-16669834174615877379?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_38_10-8108691714689715542?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_02_25-6658415225672177247?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_19_46-4782708180977179971?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_27_12-17248424612447114079?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_33_56-15311868490543209401?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_02_28-1519083799895954703?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_14_50-10994030082446193580?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_21_14-2026635232376125491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_27_30-16222411051952956511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_33_53-5217368297032824020?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_02_25-6614836565935933989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_10_25-15232352793436172082?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_17_56-9719773043994767893?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_24_19-12063877634273316036?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_30_19-2384240430525159632?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_37_08-17463978986759359546?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_02_24-11440427681417862727?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_09_11-11714209711897531588?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_17_45-3169004242285464209?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_24_53-10931141566246287206?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_31_26-2154905377473617441?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_38_09-10126808673983103622?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_02_26-15186980625513578270?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_09_57-1078584835521110778?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_17_26-9855068352595992630?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_24_13-409371765839097300?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_30_53-9729671865337084776?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_37_16-15140968137817659352?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_02_26-2097467092838045396?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_11_10-8717408455339047703?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_22_20_24-13869527120443016648?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 44s
121 actionable tasks: 95 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/xywdkufwqigjm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1414

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1414/display/redirect?page=changes>

Changes:

[hannahjiang] BEAM-7861 add direct_running_mode option

[hannahjiang] [BEAM-7861] rephrase direct_running_mode option checking


------------------------------------------
[...truncated 6.89 MB...]
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T03:10:57.457Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T03:11:00.587Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T03:11:00.615Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T03:11:00.702Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T03:11:00.724Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-011002591-01091859-sdba-harness-78jn
      Root cause: Work item failed.,
  beamapp-jenkins-011002591-01091859-sdba-harness-78jn
      Root cause: Work item failed.,
  beamapp-jenkins-011002591-01091859-sdba-harness-78jn
      Root cause: Work item failed.,
  beamapp-jenkins-011002591-01091859-sdba-harness-78jn
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T03:11:00.868Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T03:11:00.942Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T03:11:00.980Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T03:12:54.520Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T03:12:54.564Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T03:12:54.590Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-09_18_59_26-9359700799782406592 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3663.018s

FAILED (SKIP=7, errors=7)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_16_45-3979721268312461149?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_35_58-13758097985263550229?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_16_51-9807096945040819151?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_31_06-17612238535885340917?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_38_21-13231400272833310817?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_45_25-16575211714430932013?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_52_26-5375066440405168630?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_59_26-9359700799782406592?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_16_50-3992732789342683821?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_29_21-14950809923278860954?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_35_34-16919018308254821057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_41_55-6669210719051602520?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_48_41-7955031090234210844?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_55_11-9085533469986008676?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_16_47-6238157903328395114?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_35_20-6112464715226399154?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_42_56-4806278534423702163?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_49_52-5666889175675533420?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_16_46-6099089054750171896?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_24_30-8315403524638093850?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_32_53-1038752012249556199?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_48_09-1604968080685404181?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_55_10-2184016062438991526?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_16_44-16286160827079094434?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_23_49-650174842660742089?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_32_21-15338120225853131766?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_38_52-15207629334287572499?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_45_44-5697290132739590235?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_52_57-14799139089390600411?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_16_47-6236768032336403656?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_24_49-217919831019543886?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_32_53-6275417335505891660?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_39_54-978720051949627998?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_47_30-15985572337248500946?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_16_49-15093574862526365493?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_25_48-13058412600466317176?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_35_07-18141787469430219614?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_42_08-9312646479512936773?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_48_39-17887106476875160886?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_55_41-13383058267666226872?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_19_02_52-3953165525430013073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_19_09_56-3258730898434279413?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 50

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 2m 14s
121 actionable tasks: 96 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/wt3kqeainijre

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1413

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1413/display/redirect>

Changes:


------------------------------------------
[...truncated 6.74 MB...]
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T01:59:47.130Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T01:59:50.271Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T01:59:50.334Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T01:59:50.450Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T01:59:50.483Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-011001531-01091753-mvxj-harness-nhg5
      Root cause: Work item failed.,
  beamapp-jenkins-011001531-01091753-mvxj-harness-nhg5
      Root cause: Work item failed.,
  beamapp-jenkins-011001531-01091753-mvxj-harness-nhg5
      Root cause: Work item failed.,
  beamapp-jenkins-011001531-01091753-mvxj-harness-nhg5
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T01:59:50.694Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T01:59:50.886Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T01:59:50.924Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T02:01:36.491Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T02:01:36.535Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T02:01:36.570Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-09_17_53_30-6536203367733754351 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3450.313s

FAILED (SKIP=7, errors=8)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_17_37-14794900603936921793?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_31_35-7408203168448505685?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_38_53-2563885517920948135?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_47_05-3148757454045031797?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_53_30-6536203367733754351?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_17_31-6035396791350475668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_34_37-12479945668367126023?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_41_49-5487158506757234458?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_48_43-53411210690198958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_17_34-4917891871786054032?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_29_58-16270182246554908681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_36_11-13135479145634698022?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_42_26-655290886463624036?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_48_55-3675322291413138852?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_17_35-14464850439267277894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_34_19-12084019717893192969?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_42_01-11777188070487700295?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_48_55-8136903376454392543?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_17_31-16034199067287830527?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_24_50-4252958181770528842?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_32_17-3892576915175031510?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_39_12-9966924612950321103?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_46_15-945398898957862827?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_53_02-5231484940929436184?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_17_31-5552887013559080381?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_24_05-17827105700281186688?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_33_10-2246620685666483553?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_40_20-12445910854413355618?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_46_37-13659085044612247153?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_53_58-13732225265318736148?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_17_31-9980420167638050193?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_26_03-17669977599945552935?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_33_34-4113186478924630357?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_40_09-14922013667854678053?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_46_18-3961544936034152404?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_53_34-14856893732657578299?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_01_37-3982243787771459114?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_18_08_18-286196390638364563?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_17_31-1104388445818080397?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_25_52-11295790188051848121?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_17_35_41-1915209404024413268?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 166

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py2:postCommitPy2IT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58m 42s
121 actionable tasks: 95 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/dx3nshgcvf2vu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python2 #1412

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1412/display/redirect?page=changes>

Changes:

[kirillkozlov] Missing commit


------------------------------------------
[...truncated 6.83 MB...]
            "name": "funcsigs-1.0.2.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0110001856-840532.1578615536.840657/dataflow_python_sdk.tar", 
            "name": "dataflow_python_sdk.tar"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0110001856-840532.1578615536.840657/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python:beam-master-20191220"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0110001856-840532", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO", 
        "bigquery_flatten_results": true, 
        "bigquery_query": "SELECT bytes, date, time FROM [python_query_to_table_15786155321575.python_new_types_table]", 
        "bigquery_use_legacy_sql": true, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "BigQuerySource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }, 
          {
            "key": "query", 
            "label": "Query", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "STRING", 
            "value": "SELECT bytes, date, time FROM [python_query_to_table_15786155321575.python_new_types_table]"
          }, 
          {
            "key": "validation", 
            "label": "Validation Enabled", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "BOOLEAN", 
            "value": false
          }
        ], 
        "format": "bigquery", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_1"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_1"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_1"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "read.out"
          }
        ], 
        "user_name": "read"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s2", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "python_query_to_table_15786155321575", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLaqML8nPzynmCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwBKpRfo", 
              "component_encodings": [], 
              "pipeline_proto_coder_id": "ref_Coder_RowAsDictJsonCoder_3"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "schema": "{\"fields\": [{\"type\": \"BYTES\", \"name\": \"bytes\", \"mode\": \"NULLABLE\"}, {\"type\": \"DATE\", \"name\": \"date\", \"mode\": \"NULLABLE\"}, {\"type\": \"TIME\", \"name\": \"time\", \"mode\": \"NULLABLE\"}]}", 
        "table": "output_table", 
        "user_name": "write/WriteToBigQuery/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
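The JSON above is the Dataflow job description for a two-step graph: a ParallelRead that runs a legacy-SQL BigQuery query, fused into a ParallelWrite that writes the rows back to BigQuery with CREATE_IF_NEEDED / WRITE_EMPTY dispositions. A sketch of the kind of Python SDK pipeline that produces such a graph (the query, dataset, table, and schema strings are copied from the JSON; the surrounding code is an assumption, not the actual test source):

    # Sketch only: reconstructs the shape of the job graph shown above.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    QUERY = ('SELECT bytes, date, time '
             'FROM [python_query_to_table_15786155321575.python_new_types_table]')

    with beam.Pipeline(options=PipelineOptions()) as p:
        (p
         # Legacy SQL (note the [dataset.table] brackets), matching
         # "bigquery_use_legacy_sql": true in the job description.
         | 'read' >> beam.io.Read(beam.io.BigQuerySource(query=QUERY))
         | 'write' >> beam.io.WriteToBigQuery(
             'output_table',
             dataset='python_query_to_table_15786155321575',
             schema='bytes:BYTES,date:DATE,time:TIME',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))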
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-01-10T00:19:17.516023Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-01-09_16_19_15-2140153580046151543'
 location: u'us-central1'
 name: u'beamapp-jenkins-0110001856-840532'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-01-10T00:19:17.516023Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-01-09_16_19_15-2140153580046151543]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_19_15-2140153580046151543?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-09_16_19_15-2140153580046151543 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:15.714Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-01-09_16_19_15-2140153580046151543. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:15.714Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-01-09_16_19_15-2140153580046151543.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:19.916Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:21.316Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.023Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.072Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.106Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.142Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.379Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.416Z: JOB_MESSAGE_DETAILED: Fusing consumer write/WriteToBigQuery/NativeWrite into read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.452Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.481Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.507Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.535Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.702Z: JOB_MESSAGE_DEBUG: Executing wait step start3
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.904Z: JOB_MESSAGE_BASIC: Executing operation read+write/WriteToBigQuery/NativeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.953Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:22.990Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:26.508Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_14045176391248151878". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_14045176391248151878".
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:39.679Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
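
As a containment sketch for the warning above, the two REST methods it links
to can be driven from the command line; this assumes gcloud credentials with
monitoring scope, and the project id is taken from the log:

    # List existing descriptors (monitoring.projects.metricDescriptors.list);
    # stale descriptor names from the response can then be passed to the
    # corresponding DELETE method.
    curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
      "https://monitoring.googleapis.com/v3/projects/apache-beam-testing/metricDescriptors"
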
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:19:46.932Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:20:49.327Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:20:49.362Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:25:22.670Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-10T00:31:22.670Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-01-09_16_19_15-2140153580046151543 after 902 seconds
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3576.403s

FAILED (SKIP=7, errors=7, failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_15_57_53-12302648891512666695?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_14_17-4308098122410931379?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_21_27-10325531382334762216?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_28_25-13918820796813867376?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_35_46-16225185457949771160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_42_53-6283410656817011841?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_49_57-13353904242074256010?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_15_57_56-11811299229191841644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_19_07-13328754708269136832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_15_57_57-7185353330452110822?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_12_12-9112992841148741640?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_19_15-2140153580046151543?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_36_17-9116774514998059138?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_43_13-15348346257439952497?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_15_57_51-7769643122952477211?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_13_58-16536799839945204223?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_21_30-13723246676390294110?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_28_09-2616301363384829398?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_36_22-4388437358867526171?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_15_57_51-3570375453309229354?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_05_16-1728707360298088774?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_13_26-914782496162647465?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_21_03-11139342908500977827?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_27_16-9580357077994719448?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_33_44-8398210128926348644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_15_57_50-2108206644690011925?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_04_29-9071932337300457267?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_13_32-9484230084594221510?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_21_06-3973423949911596410?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_28_33-10631555177952340535?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_34_58-12988138620186204322?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_15_57_52-15647392159273265102?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_05_55-11245994855557583481?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_13_40-15798959857993387998?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_21_07-6723824947046750716?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_27_26-13689291860984342055?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_35_29-228840315820859120?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_15_57_50-17438050572062278882?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_06_06-16230376237483630883?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_15_54-14810192987023189969?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_24_09-15731553004801096261?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-09_16_31_04-11087073214546904290?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 46s
121 actionable tasks: 95 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/mxjk4priohqlm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1411

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1411/display/redirect?page=changes>

Changes:

[kirillkozlov] Add a new Jenkins job for SQL perf tests

[kirillkozlov] Test boilerplate

[kirillkozlov] Table proxy to add TimeMonitor after the IO

[kirillkozlov] Tests for direct_read w/o push-down and default methods

[kirillkozlov] Cleanup

[kirillkozlov] Monitor total number of fields read from an IO

[kirillkozlov] Metric name should not be constant


------------------------------------------
Started by GitHub push by apilloud
Started by GitHub push by apilloud
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Python2/ws/>
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 740534bbc9035bd9245e1ecf48641d6f0991e2ae (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 740534bbc9035bd9245e1ecf48641d6f0991e2ae
Commit message: "Merge pull request #10226: [BEAM-8844] Add a new Jenkins job for SQL perf tests"
 > git rev-list --no-walk a41efbf13a8dd4c52b1f9a3af4bb1183f536c644 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :python2PostCommit
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.

> Configure project :sdks:python:container
Found go 1.12 in /usr/bin/go, use it.

FAILURE: Build failed with an exception.

* What went wrong:
Could not determine the dependencies of task ':runners:spark:job-server:shadowJar'.
> Could not resolve all dependencies for configuration ':runners:spark:job-server:runtimeClasspath'.
   > Could not resolve net.minidev:json-smart:[1.3.1,2.3].
     Required by:
         project :runners:spark:job-server > project :runners:spark > org.apache.hadoop:hadoop-common:2.8.5 > org.apache.hadoop:hadoop-auth:2.8.5 > com.nimbusds:nimbus-jose-jwt:4.41.1
      > Could not resolve net.minidev:json-smart:2.3-SNAPSHOT.
         > Unable to load Maven meta-data from https://oss.sonatype.org/content/repositories/staging/net/minidev/json-smart/2.3-SNAPSHOT/maven-metadata.xml.
            > Could not get resource 'https://oss.sonatype.org/content/repositories/staging/net/minidev/json-smart/2.3-SNAPSHOT/maven-metadata.xml'.
               > Could not GET 'https://oss.sonatype.org/content/repositories/staging/net/minidev/json-smart/2.3-SNAPSHOT/maven-metadata.xml'.
                  > Read timed out
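
A plausible hardening for this flake, assuming the trigger is the open-ended
version range [1.3.1,2.3] declared by nimbus-jose-jwt (it makes Gradle probe
every configured repository, including the slow Sonatype staging repo, for
candidate versions such as 2.3-SNAPSHOT): pin the transitive dependency to a
fixed release. The coordinates come from the error above; choosing 2.3 is an
assumption, and the snippet is a sketch rather than Beam's actual fix.

    configurations.all {
        resolutionStrategy {
            // Hypothetical pin: resolve json-smart to one concrete release so
            // dependency resolution never fetches SNAPSHOT metadata remotely.
            force "net.minidev:json-smart:2.3"
        }
    }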

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 48s

Publishing build scan...
https://gradle.com/s/ctltrboeiyc7g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python2 #1410

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python2/1410/display/redirect?page=changes>

Changes:

[ehudm] Small fixes to verify_release_build.sh

[12602502+Ardagan] [BEAM-8821] Document Python SDK 2.17.0 deps (#10212)


------------------------------------------
[...truncated 6.84 MB...]
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T22:38:17.561Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 647, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 176, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 649, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 651, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 652, in apache_beam.runners.worker.operations.DoOperation.start
    super(DoOperation, self).start()
  File "apache_beam/runners/worker/operations.py", line 261, in apache_beam.runners.worker.operations.Operation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 266, in apache_beam.runners.worker.operations.Operation.start
    self.setup()
  File "apache_beam/runners/worker/operations.py", line 597, in apache_beam.runners.worker.operations.DoOperation.setup
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 602, in apache_beam.runners.worker.operations.DoOperation.setup
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 290, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 275, in loads
    return load(file, ignore, **kwds)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 270, in load
    return Unpickler(file, ignore=ignore, **kwds).load()
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 472, in load
    obj = StockUnpickler.load(self)
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1139, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/_dill.py", line 827, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py", line 26, in <module>
    from hamcrest.library.number.ordering_comparison import greater_than
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/__init__.py", line 2, in <module>
    from hamcrest.library import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/__init__.py", line 7, in <module>
    from hamcrest.library.object import *
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/__init__.py", line 4, in <module>
    from .hasproperty import has_properties, has_property
  File "/usr/local/lib/python2.7/dist-packages/hamcrest/library/object/hasproperty.py", line 174
    ),
    ^
SyntaxError: invalid syntax

[...identical JOB_MESSAGE_ERROR traceback repeated at 2020-01-09T22:38:20.687Z, 22:38:23.820Z, and 22:38:26.946Z, once per retried attempt of the work item (four attempts in total, per the failure summary below)...]
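
The SyntaxError above is Python 2.7 failing to parse hasproperty.py in the
installed PyHamcrest, which suggests the Dataflow workers picked up a
PyHamcrest release that is no longer Python 2 compatible. A minimal
containment sketch, assuming the break arrived with a newer PyHamcrest
release (the exact pin below is an assumption, not taken from this log):

    # requirements.txt (hypothetical pin): hold PyHamcrest at a release
    # assumed to still parse under Python 2.7.
    pyhamcrest==1.9.0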

apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T22:38:26.971Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T22:38:27.032Z: JOB_MESSAGE_DEBUG: Executing failure step failure12
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T22:38:27.056Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S02:Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write failed., The job failed because a work item has failed 4 times. Look in previous log entries for the cause of each one of the 4 failures. For more information, see https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was attempted on these workers: 
  beamapp-jenkins-010922332-01091433-p3x5-harness-m7ds
      Root cause: Work item failed.,
  beamapp-jenkins-010922332-01091433-p3x5-harness-m7ds
      Root cause: Work item failed.,
  beamapp-jenkins-010922332-01091433-p3x5-harness-m7ds
      Root cause: Work item failed.,
  beamapp-jenkins-010922332-01091433-p3x5-harness-m7ds
      Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T22:38:27.158Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T22:38:27.210Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T22:38:27.227Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T22:39:52.814Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T22:39:52.847Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T22:39:52.871Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-09_14_33_49-12510686855371423051 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3616.865s

FAILED (SKIP=7, errors=7)

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 1m 19s
121 actionable tasks: 95 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/t3iimaxl62tge

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org