Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/04/13 18:35:22 UTC

Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py27 #1329

See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/1329/display/redirect?page=changes>

Changes:

[suztomo] google-api-client 1.30.9

[chamikara] Updates Dataflow stateful DoFn setup to support external transforms

[github] [BEAM-9738] Update dataflow to setup correct docker environment options.

[github] [BEAM-9136]Add licenses for dependencies for Java (#11243)


------------------------------------------
[...truncated 63.93 KB...]
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/test-suites/dataflow/py2> Command: sh -c . <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/build/gradleenv/-194514014/bin/activate> && <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/scripts/run_integration_test.sh> --test_opts "--tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture" --pipeline_opts "--project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz" --suite integrationTest-perf
Successfully started process 'command 'sh''
>>> RUNNING integration tests with pipeline options: --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --input=gs://apache-beam-samples/input_small_files/ascii_sort_1MB_input.0000* --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --expect_checksum=ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 --num_workers=10 --autoscaling_algorithm=NONE --runner=TestDataflowRunner --sdk_location=build/apache-beam.tar.gz
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --attr=IT --nocapture
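
The --expect_checksum value in the pipeline options above is compared against a checksum computed from the pipeline's output once the job finishes. A minimal sketch of one way such a checksum can be derived, assuming SHA-1 over the sorted output lines; the hashing scheme and the output path below are illustrative assumptions, not the exact matcher used by the test:

    import glob
    import hashlib

    def output_checksum(pattern):
        # Read every output shard matching the pattern, sort the lines, and
        # hash them so the result does not depend on sharding order.
        lines = []
        for path in glob.glob(pattern):
            with open(path) as f:
                lines.extend(line.rstrip('\n') for line in f)
        return hashlib.sha1('\n'.join(sorted(lines)).encode('utf-8')).hexdigest()

    # Compare against ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710 from --expect_checksum:
    # print(output_checksum('/tmp/wordcount-output*'))
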
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'

> Task :sdks:python:test-suites:dataflow:py2:integrationTest FAILED
:sdks:python:test-suites:dataflow:py2:integrationTest (Thread[Execution worker for ':',5,main]) completed. Took 2.825 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
5 actionable tasks: 5 executed

Publishing build scan...
https://gradle.com/s/bkvttokd6unxg


STDERR: DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
setup.py:247: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.22.0.dev' to '2.22.0.dev0'
  normalized_version,
INFO:gen_protos:Regenerating Python proto definitions (no output files).
INFO:gen_protos:Found protoc_gen_mypy at <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/build/gradleenv/1922375555/bin/protoc-gen-mypy>
metrics.proto: warning: Import google/protobuf/timestamp.proto but not used.
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
beam_interactive_api.proto: warning: Import google/protobuf/timestamp.proto but not used.
Writing mypy to endpoints_pb2.pyi
Writing mypy to external_transforms_pb2.pyi
Writing mypy to beam_provision_api_pb2.pyi
Writing mypy to beam_runner_api_pb2.pyi
Writing mypy to standard_window_fns_pb2.pyi
Writing mypy to beam_artifact_api_pb2.pyi
Writing mypy to beam_fn_api_pb2.pyi
Writing mypy to metrics_pb2.pyi
Writing mypy to schema_pb2.pyi
Writing mypy to beam_job_api_pb2.pyi
Writing mypy to beam_interactive_api_pb2.pyi
Writing mypy to beam_expansion_api_pb2.pyi
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Skipping optional fixer: idioms
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Skipping optional fixer: ws_comma
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_urns.py>
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md

DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
setup.py:247: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.22.0.dev' to '2.22.0.dev0'
  normalized_version,
INFO:gen_protos:Skipping proto regeneration: all files up to date
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/__init__.py>:82: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:root:Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
INFO:root:Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR

======================================================================
ERROR: test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py>", line 50, in test_wordcount_it
    self._run_wordcount_it(wordcount.run)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py>", line 57, in _run_wordcount_it
    test_pipeline = TestPipeline(is_integration_test=True)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 107, in __init__
    super(TestPipeline, self).__init__(runner, options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/apache_beam/pipeline.py>", line 202, in __init__
    'Pipeline has validations errors: \n' + '\n'.join(errors))
ValueError: Pipeline has validations errors: 
Missing required option: region.
-------------------- >> begin captured logging << --------------------
root: INFO: Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
root: INFO: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
--------------------- >> end captured logging << ---------------------
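
The ValueError above is a pipeline option validation failure: the TestDataflowRunner run requires a --region option, and none is present in the --pipeline_opts passed to run_integration_test.sh for this build (the later build #1330 picks up "[BEAM-9744] Add missing region option to py perf tests"). A minimal sketch of supplying the option through PipelineOptions, assuming the us-central1 region that the subsequent Dataflow job runs in; the flag list is illustrative, not the test's full configuration:

    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Illustrative subset of flags; a real WordCountIT run passes many more.
    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--region=us-central1',  # the option whose absence produced the error above
    ])
    print(options.view_as(GoogleCloudOptions).region)  # us-central1
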

----------------------------------------------------------------------
XML: nosetests-integrationTest-perf.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 1.078s

FAILED (errors=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:integrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1m 8s

2020-04-13 18:35:20,808 88f65927 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 846, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 689, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 161, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 96, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2020-04-13 18:35:20,809 88f65927 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2020-04-13 18:35:20,811 88f65927 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 995, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 846, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 689, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 161, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 96, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2020-04-13 18:35:20,811 88f65927 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2020-04-13 18:35:20,811 88f65927 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2020-04-13 18:35:20,812 88f65927 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/runs/88f65927/pkb.log>
2020-04-13 18:35:20,812 88f65927 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/runs/88f65927/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_WordCountIT_Py27 #1331

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/1331/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_WordCountIT_Py27 #1330

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/1330/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-9496] Evaluation of deferred dataframes via Beam operations.

[robertwb] Fix and test tuple inputs and outputs.

[robertwb] Comments and clarification.

[kcweaver] [BEAM-9744] Add missing region option to py perf tests.

[lcwik] [BEAM-9562] Fix output timestamp to be inferred from scheduled time when

[kcweaver] [BEAM-9744] Remove --region option from SQL tests.


------------------------------------------
[...truncated 142.75 KB...]
              ], 
              "is_stream_like": {
                "value": true
              }
            }, 
            "output_name": "out", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0).output"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s17"
        }, 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)", 
        "windowing_strategy": "%0AB%22%40%0A%1Dref_Coder_GlobalWindowCoder_1%12%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jQ%0A%22%0A%20beam%3Awindow_fn%3Aglobal_windows%3Av1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_finalize_write"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {
          "python_side_input0-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "SideInput-s18"
          }, 
          "python_side_input1-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "SideInput-s19"
          }, 
          "python_side_input2-write/Write/WriteImpl/FinalizeWrite": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "SideInput-s20"
          }
        }, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "<string of 3004 bytes>", 
        "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-04-14T00:34:18.932728Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-13_17_34_17-12076179844755681561'
 location: u'us-central1'
 name: u'beamapp-jenkins-0414003415-755223'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-14T00:34:18.932728Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-04-13_17_34_17-12076179844755681561]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-13_17_34_17-12076179844755681561?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-04-13_17_34_17-12076179844755681561 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:20.962Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:21.479Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.342Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.384Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.422Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.456Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.488Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.579Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.639Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.663Z: JOB_MESSAGE_DETAILED: Fusing consumer split into read/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.695Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.725Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Reify into pair_with_one
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.757Z: JOB_MESSAGE_DETAILED: Fusing consumer group/Write into group/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.795Z: JOB_MESSAGE_DETAILED: Fusing consumer group/GroupByWindow into group/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.833Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.872Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.914Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.955Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:22.992Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.028Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.065Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.093Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.116Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.145Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.179Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.202Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.222Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.252Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.382Z: JOB_MESSAGE_DEBUG: Executing wait step start26
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.444Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.477Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.481Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.507Z: JOB_MESSAGE_BASIC: Executing operation group/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.507Z: JOB_MESSAGE_BASIC: Starting 10 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.553Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.553Z: JOB_MESSAGE_BASIC: Finished operation group/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.608Z: JOB_MESSAGE_DEBUG: Value "group/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.636Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:23.676Z: JOB_MESSAGE_BASIC: Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:36.945Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:47.880Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 8 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:47.922Z: JOB_MESSAGE_DETAILED: Resized worker pool to 8, though goal was 10.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:53.436Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:36:03.191Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:36:03.231Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.368Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.444Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Read.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.477Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/InitializeWrite.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.539Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.577Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.603Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.607Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.642Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.669Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.679Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.719Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.754Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:40:23.359Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
Terminated
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=6b670da3-ad67-4ca8-b775-db17b5e053fb, currentDir=<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 16780
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-16780.out.log
----- Last  20 lines from daemon log file - daemon-16780.out.log -----
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:47.880Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 8 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:47.922Z: JOB_MESSAGE_DETAILED: Resized worker pool to 8, though goal was 10.  This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:34:53.436Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:36:03.191Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:36:03.231Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.368Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.444Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/DoOnce/Read.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.477Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/InitializeWrite.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.539Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.577Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.603Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.607Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.642Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.669Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.679Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.719Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:39:15.754Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0).output" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-14T00:40:23.359Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
Terminated
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

2020-04-14 00:40:31,174 60bf10e0 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 846, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 689, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 161, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 96, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2020-04-14 00:40:31,175 60bf10e0 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2020-04-14 00:40:31,177 60bf10e0 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 995, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 846, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 689, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 161, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 96, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2020-04-14 00:40:31,177 60bf10e0 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2020-04-14 00:40:31,178 60bf10e0 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2020-04-14 00:40:31,178 60bf10e0 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/runs/60bf10e0/pkb.log>
2020-04-14 00:40:31,178 60bf10e0 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/runs/60bf10e0/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org