Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/05/03 18:43:27 UTC

Build failed in Jenkins: beam_PerformanceTests_Python35 #38

See <https://builds.apache.org/job/beam_PerformanceTests_Python35/38/display/redirect?page=changes>

Changes:

[aromanenko.dev] [BEAM-6247] Remove deprecated module “hadoop-input-format”

[mxm] [BEAM-7192] Fix partitioning of buffered elements during checkpointing

[github] Update setup.py for pyarrow

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 896402425b36b54e2cf00249fec2d84ec86c7e63 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 896402425b36b54e2cf00249fec2d84ec86c7e63
Commit message: "Merge pull request #8484 from apache/aaltay-patch-1"
 > git rev-list --no-walk 0606ba825c30eeca23841de238bb262be08c77f3 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins8979878885477218533.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4851100477608529538.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins7914079906613552667.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins8717630015557691896.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4139773286404126802.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2405727172268446775.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins6894883881085159476.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-03 18:43:24,086 d9c03b49 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/d9c03b49/pkb.log>
2019-05-03 18:43:24,086 d9c03b49 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-03 18:43:24,087 d9c03b49 MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
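(For orientation: pkb.py combines the flags above -- notably --beam_it_module, --beam_it_class, --beam_it_args and --beam_python_sdk_location -- into the single "gradlew integrationTest" command that is logged a few lines below. The following Python sketch of that assembly is purely illustrative; every function and parameter name in it is an assumption, not PerfKitBenchmarker's actual code.)

    # Illustrative sketch only (not PerfKitBenchmarker's code) of how the
    # benchmark flags map onto the gradlew invocation logged below.
    def build_gradle_command(gradlew, it_module, it_class, pipeline_options):
        """Assemble the integration-test command from the benchmark flags."""
        return [
            gradlew, 'integrationTest',
            '-Dtests=' + it_class,
            '-p', it_module,
            '-Dattr=IT',
            '-DpipelineOptions=' + ' '.join(pipeline_options),
            '--info', '--scan',
        ]

    cmd = build_gradle_command(
        gradlew='src/gradlew',
        it_module='sdks/python/test-suites/dataflow/py3',
        it_class='apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it',
        pipeline_options=[
            '--project=apache-beam-testing',
            '--runner=TestDataflowRunner',
            '--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz',
        ])
    print(' '.join(cmd))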
2019-05-03 18:43:24,437 d9c03b49 MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-03 18:43:24,459 d9c03b49 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-03 18:43:24,481 d9c03b49 MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-03 18:43:24,483 d9c03b49 MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-03 18:43:24,483 d9c03b49 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-03 18:43:24,494 d9c03b49 MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-03 18:43:26,441 d9c03b49 MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.852 secs.
The client will now receive all logging from the daemon (pid: 28303). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-28303.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3>' does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
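
(The Gradle failure above is a missing project directory: the checkout at this revision evidently has no sdks/python/test-suites/dataflow/py3 module, while the #51 log further down works against .../dataflow/py35 paths instead. A pre-flight check such as the hypothetical snippet below -- not part of the job's scripts -- would fail fast with a clearer message before Gradle is even started.)

    # Hypothetical pre-flight check; the path and message are taken from the
    # error above, the check itself is an assumption.
    import os
    import sys

    module_dir = os.path.join('src', 'sdks/python/test-suites/dataflow/py3')
    if not os.path.isdir(module_dir):
        sys.exit('Gradle project directory does not exist: ' + module_dir)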

2019-05-03 18:43:26,442 d9c03b49 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-03 18:43:26,443 d9c03b49 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-03 18:43:26,444 d9c03b49 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
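
(The AssertionError above comes from PerfKitBenchmarker's Dataflow provider: as line 90 of gcp_dpb_dataflow.py in the traceback shows, it simply asserts that the shelled-out Gradle command returned 0. A minimal sketch of that pattern follows, assuming a plain subprocess call rather than PKB's own command helpers; the invocation in the comment is hypothetical.)

    # Minimal sketch of the failing pattern: the Gradle command exits
    # non-zero and the assertion surfaces as "Integration Test Failed."
    import subprocess

    def submit_job(cmd):
        retcode = subprocess.call(cmd)
        assert retcode == 0, "Integration Test Failed."

    # Hypothetical invocation:
    # submit_job(['src/gradlew', 'integrationTest',
    #             '-p', 'sdks/python/test-suites/dataflow/py3'])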
2019-05-03 18:43:26,445 d9c03b49 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-03 18:43:26,445 d9c03b49 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-03 18:43:26,445 d9c03b49 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/d9c03b49/pkb.log>
2019-05-03 18:43:26,445 d9c03b49 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/d9c03b49/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_Python35 #52

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/52/display/redirect>




Build failed in Jenkins: beam_PerformanceTests_Python35 #51

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/51/display/redirect>

------------------------------------------
[...truncated 153.21 KB...]
RefactoringTool: Skipping optional fixer: ws_comma
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: xreadlines
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: zip
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: absolute_import
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: basestring
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: cmp
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: division_safe
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: execfile
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: future_builtins
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: future_standard_library
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: future_standard_library_urllib
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: metaclass
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: next_call
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: object
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: print_with_import
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: raise
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: unicode_keep_u
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: xrange_with_import
root: Generating grammar tables from /usr/lib/python3.5/lib2to3/PatternGrammar.txt
RefactoringTool: Adding transformation: newstyle
RefactoringTool: Descending into <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/__init__.py>
RefactoringTool: No changes in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/__init__.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: No changes in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/endpoints_pb2_grpc.py>
RefactoringTool: No changes in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/endpoints_pb2_grpc.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/external_transforms_pb2_grpc.py>
RefactoringTool: No changes in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/external_transforms_pb2_grpc.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/metrics_pb2_grpc.py>
RefactoringTool: No changes in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/metrics_pb2_grpc.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Wrote changes to <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Refactoring <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/standard_window_fns_pb2_grpc.py>
RefactoringTool: No changes in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/standard_window_fns_pb2_grpc.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build/srcs/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
setup.py:177: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/build/gradleenv/-1709362674/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: cmd: standard file not found: should have one of README, README.rst, README.txt, README.md

setup.py:177: UserWarning: Python 3 support for the Apache Beam SDK is not yet fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/build/gradleenv/-1709362674/lib/python3.5/site-packages/setuptools/dist.py>:472: UserWarning: Normalizing '2.13.0.dev' to '2.13.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/__init__.py>:84: UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ERROR

======================================================================
ERROR: test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 49, in test_wordcount_it
    self._run_wordcount_it(wordcount.run)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/examples/wordcount_it_test.py",> line 81, in _run_wordcount_it
    run_wordcount(test_pipeline.get_full_options_as_args(**extra_opts))
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/examples/wordcount.py",> line 114, in run
    result = p.run()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/pipeline.py",> line 406, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/pipeline.py",> line 419, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 53, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 460, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 519, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 549, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 479, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 273, in stage_job_resources
    'the --sdk_location command-line option.' % sdk_path)
RuntimeError: The file "test-suites/dataflow/py3/build/apache-beam.tar.gz" cannot be found. Its location was specified by the --sdk_location command-line option.
-------------------- >> begin captured logging << --------------------
avro.schema: Level 5: Register new name for 'org.apache.avro.file.Header'
avro.schema: Level 5: Register new name for 'org.apache.avro.file.magic'
avro.schema: Level 5: Register new name for 'org.apache.avro.file.sync'
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://dataflow-samples/shakespeare/kinglear.txt' -> 'gs\\:\\/\\/dataflow\\-samples\\/shakespeare\\/kinglear\\.txt'
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: DEBUG: Connecting using Google Application Default Credentials.
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://dataflow-samples/shakespeare/kinglear.txt' -> 'gs\\:\\/\\/dataflow\\-samples\\/shakespeare\\/kinglear\\.txt'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0506191654-225551.1557170214.225730/pipeline.pb...
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0506191654-225551.1557170214.225730/pipeline.pb in 0 seconds.
root: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0506191654-225551.1557170214.225730/pickled_main_session...
root: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0506191654-225551.1557170214.225730/pickled_main_session in 0 seconds.
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1557170213817/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1557170213817/results*' -> 'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1557170213817\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.06185793876647949 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 1.136s

FAILED (errors=1)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-python-test-suites-dataflow-py35:integrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 53s
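
(In build #50/#51 the Gradle module itself resolves, but the test aborts while staging because the tarball passed via --sdk_location, test-suites/dataflow/py3/build/apache-beam.tar.gz, is a relative path that does not exist in the test's working directory; that is what raises the RuntimeError shown in the traceback above. Below is a hedged sketch of that check: the function name is an assumption, only the behaviour -- a missing --sdk_location path aborts staging -- is taken from the log.)

    # Hedged sketch of the check behind the RuntimeError above.
    import os

    def require_sdk_tarball(sdk_location):
        if not os.path.isfile(sdk_location):
            raise RuntimeError(
                'The file "%s" cannot be found. Its location was specified by '
                'the --sdk_location command-line option.' % sdk_location)

    # The relative path is resolved against the test's working directory,
    # which is why it is not found when the tarball was never built there:
    # require_sdk_tarball('test-suites/dataflow/py3/build/apache-beam.tar.gz')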

2019-05-06 19:16:56,440 6eba1e6c MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-06 19:16:56,441 6eba1e6c MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-06 19:16:56,443 6eba1e6c MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-06 19:16:56,443 6eba1e6c MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-06 19:16:56,444 6eba1e6c MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-06 19:16:56,444 6eba1e6c MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/6eba1e6c/pkb.log>
2019-05-06 19:16:56,444 6eba1e6c MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/6eba1e6c/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Python35 #50

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/50/display/redirect?page=changes>

Changes:

[bhulette] Fix EXCEPT DISTINCT behavior

[juta.staes] [BEAM-7066] re-add python 3.6 and 3.7 precommit test suites

[github] Adding PyDoc to CombiningValueStateSpec (#8477)

[iemejia] Add UsesSchema category to Schema transform tests

[pabloem] Skipping fileio tests on windows

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-3 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision fb88549b7c1badf37d6a3e3e2be53281faa7c893 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f fb88549b7c1badf37d6a3e3e2be53281faa7c893
Commit message: "Merge pull request #8507: Add UsesSchema category to Schema transform tests"
 > git rev-list --no-walk 54f89e1cd4af8cede0c37ab214ce0f9cdd023a3d # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins779467439372021237.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3195798771440116687.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins89360714088893282.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4962302095098910400.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins8395432095251715117.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins5954301135872891294.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins104174818947838035.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-06 18:43:35,846 f6c9ddf4 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/f6c9ddf4/pkb.log>
2019-05-06 18:43:35,847 f6c9ddf4 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-06 18:43:35,848 f6c9ddf4 MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
2019-05-06 18:43:36,100 f6c9ddf4 MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-06 18:43:36,122 f6c9ddf4 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-06 18:43:36,146 f6c9ddf4 MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-06 18:43:36,148 f6c9ddf4 MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-06 18:43:36,149 f6c9ddf4 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-06 18:43:36,159 f6c9ddf4 MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-06 18:43:38,226 f6c9ddf4 MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.938 secs.
The client will now receive all logging from the daemon (pid: 22783). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-22783.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3>' does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-06 18:43:38,227 f6c9ddf4 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-06 18:43:38,228 f6c9ddf4 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-06 18:43:38,230 f6c9ddf4 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-06 18:43:38,230 f6c9ddf4 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-06 18:43:38,230 f6c9ddf4 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-06 18:43:38,230 f6c9ddf4 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/f6c9ddf4/pkb.log>
2019-05-06 18:43:38,230 f6c9ddf4 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/f6c9ddf4/completion_statuses.json>
Build step 'Execute shell' marked build as failure
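
Note on this failure (and the identical ones later on this page): Gradle never reaches the WordCount test. The pkb.py invocation passes --beam_it_module=sdks/python/test-suites/dataflow/py3, PerfKitBenchmarker runs gradlew with -p pointing at that module, and Gradle exits with return code 1 because that project directory does not exist under the checked-out source; SubmitJob then turns the non-zero code into the AssertionError "Integration Test Failed." The following is a minimal pre-flight sketch, not part of PerfKitBenchmarker; the local checkout path is an assumption, and the module path is copied from the failing command above.

import os
import sys

# Assumed local clone of apache/beam (the Jenkins job checks it out under ws/src).
BEAM_CHECKOUT = os.path.abspath("src")
# The -p value from the gradlew command logged above.
GRADLE_MODULE = "sdks/python/test-suites/dataflow/py3"

module_dir = os.path.join(BEAM_CHECKOUT, GRADLE_MODULE)
if not os.path.isdir(module_dir):
    # This is the same condition Gradle reports:
    # "The specified project directory '.../py3' does not exist."
    sys.exit("missing Gradle project directory: %s" % module_dir)
print("Gradle project directory present: %s" % module_dir)

If this check fails at the pinned revision above, the --beam_it_module flag (or the repository layout at that revision) is a more likely culprit than the integration test itself.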



Build failed in Jenkins: beam_PerformanceTests_Python35 #49

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/49/display/redirect?page=changes>

Changes:

[relax] Fix non-determistic row access

[iemejia] Refine Spark ValidatesRunner exclusions

[iemejia] [BEAM-7227] Instantiate PipelineRunner from options to support other

[iemejia] Categorize missing unbounded NeedsRunner tests in sdks/java/core

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 54f89e1cd4af8cede0c37ab214ce0f9cdd023a3d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 54f89e1cd4af8cede0c37ab214ce0f9cdd023a3d
Commit message: "Merge pull request #8489: Categorize missing unbounded NeedsRunner tests in sdks/java/core"
 > git rev-list --no-walk 207ac338b12b90fd6095cebc609f3c453b2980f8 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins5887700143534758870.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins8414239899605930253.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3640404893099052996.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins6314302909219633659.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2252568880071670482.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins8176884321527524863.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins6312321465844905632.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-06 12:43:24,001 83e65e38 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/83e65e38/pkb.log>
2019-05-06 12:43:24,002 83e65e38 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-06 12:43:24,003 83e65e38 MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
2019-05-06 12:43:24,246 83e65e38 MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-06 12:43:24,268 83e65e38 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-06 12:43:24,291 83e65e38 MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-06 12:43:24,293 83e65e38 MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-06 12:43:24,294 83e65e38 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-06 12:43:24,304 83e65e38 MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-06 12:43:26,282 83e65e38 MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon (subsequent builds will be faster)
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.898 secs.
The client will now receive all logging from the daemon (pid: 32005). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-32005.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3>' does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-06 12:43:26,283 83e65e38 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-06 12:43:26,284 83e65e38 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-06 12:43:26,286 83e65e38 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-06 12:43:26,287 83e65e38 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-06 12:43:26,287 83e65e38 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-06 12:43:26,287 83e65e38 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/83e65e38/pkb.log>
2019-05-06 12:43:26,287 83e65e38 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/83e65e38/completion_statuses.json>
Build step 'Execute shell' marked build as failure
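
The traceback for this run shows exactly where the non-zero Gradle exit becomes a hard failure: SubmitJob in perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py asserts retcode == 0 (line 90 above). The stand-alone sketch below re-runs the same Gradle task and surfaces that return code outside PKB. The wrapper function and default path are hypothetical; only the task name and the -p module come from the logged command, and the -DpipelineOptions arguments are dropped because they require GCP credentials and are not needed to reproduce the early "project directory does not exist" failure.

import os
import subprocess

def run_wordcount_it(beam_src="src"):
    """Invoke the Gradle task logged above and return its exit code."""
    beam_src = os.path.abspath(beam_src)
    cmd = [
        os.path.join(beam_src, "gradlew"),
        "integrationTest",
        "-Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it",
        "-p", "sdks/python/test-suites/dataflow/py3",
        "-Dattr=IT",
        "--info",
    ]
    return subprocess.call(cmd, cwd=beam_src)

if __name__ == "__main__":
    rc = run_wordcount_it()
    # gcp_dpb_dataflow.py effectively does: assert rc == 0, "Integration Test Failed."
    print("gradlew exited with return code %d" % rc)

A return code of 1 here, accompanied by the same "specified project directory ... does not exist" message, would point at the Gradle module layout rather than at the WordCount test or the Dataflow runner.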



Build failed in Jenkins: beam_PerformanceTests_Python35 #48

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/48/display/redirect?page=changes>

Changes:

[heejong] [BEAM-7102] Adding `jar_packages` experiment option for Python SDK

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 207ac338b12b90fd6095cebc609f3c453b2980f8 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 207ac338b12b90fd6095cebc609f3c453b2980f8
Commit message: "Merge pull request #8340: [BEAM-7102] Adding `jar_packages` experiment option for Python SDK"
 > git rev-list --no-walk 9488352398ec95983b1d878f5d1df56821f9045e # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3158502267185679379.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4148624106250129853.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins356492422700562725.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins1357919245834401008.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins7871843382450414942.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4862111194101047915.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins8474666203615857386.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-06 06:43:23,910 b0d6b40b MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/b0d6b40b/pkb.log>
2019-05-06 06:43:23,911 b0d6b40b MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-06 06:43:23,912 b0d6b40b MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
2019-05-06 06:43:24,075 b0d6b40b MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-06 06:43:24,097 b0d6b40b MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-06 06:43:24,119 b0d6b40b MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-06 06:43:24,121 b0d6b40b MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-06 06:43:24,122 b0d6b40b MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-06 06:43:24,132 b0d6b40b MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-06 06:43:26,172 b0d6b40b MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.942 secs.
The client will now receive all logging from the daemon (pid: 18886). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-18886.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3>' does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-06 06:43:26,173 b0d6b40b MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-06 06:43:26,174 b0d6b40b MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-06 06:43:26,175 b0d6b40b MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-06 06:43:26,176 b0d6b40b MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-06 06:43:26,176 b0d6b40b MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-06 06:43:26,176 b0d6b40b MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/b0d6b40b/pkb.log>
2019-05-06 06:43:26,177 b0d6b40b MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/b0d6b40b/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
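The failure above happens before any test code runs: the Gradle module passed via -p (sdks/python/test-suites/dataflow/py3) is not present under the --beam_location checkout, so gradlew aborts with "project directory does not exist" and PerfKitBenchmarker then turns the non-zero exit code into "Integration Test Failed." A minimal pre-flight check along the following lines would report the missing module directly; this is an illustrative sketch, not PerfKitBenchmarker code, and run_it_suite is a hypothetical helper name.

    # Sketch: verify the Gradle module directory exists under the Beam checkout
    # before invoking gradlew (paths taken from the job invocation above).
    import os
    import subprocess

    def run_it_suite(beam_location, project_path, extra_args):
        project_dir = os.path.join(beam_location, project_path)
        if not os.path.isdir(project_dir):
            # Same condition Gradle reports as
            # "The specified project directory ... does not exist."
            raise RuntimeError('missing Gradle module: %s' % project_dir)
        cmd = [os.path.join(beam_location, 'gradlew'), 'integrationTest',
               '-p', project_path] + list(extra_args)
        return subprocess.call(cmd, cwd=beam_location)

    # Example (values from the pkb.py flags above):
    # run_it_suite('.../ws/src', 'sdks/python/test-suites/dataflow/py3', ['--info'])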


Build failed in Jenkins: beam_PerformanceTests_Python35 #47

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/47/display/redirect?page=changes>

Changes:

[robert] fix format string errors with errors package.

[robert] Update Go protos.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9488352398ec95983b1d878f5d1df56821f9045e (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9488352398ec95983b1d878f5d1df56821f9045e
Commit message: "Merge pull request #8487: Update Go protos + minor fix."
 > git rev-list --no-walk 98453a0da534be0bc632bfd017377062a74d59de # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins7168684077446177.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins6900605316859716239.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins8911907338347269459.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3466406800055386611.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins1745061916974531497.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins1617276572696005737.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3138994156497469948.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-06 00:43:22,932 35de572c MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/35de572c/pkb.log>
2019-05-06 00:43:22,932 35de572c MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-06 00:43:22,934 35de572c MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
2019-05-06 00:43:23,166 35de572c MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-06 00:43:23,188 35de572c MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-06 00:43:23,210 35de572c MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-06 00:43:23,212 35de572c MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-06 00:43:23,213 35de572c MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-06 00:43:23,223 35de572c MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-06 00:43:25,260 35de572c MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon (subsequent builds will be faster)
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.938 secs.
The client will now receive all logging from the daemon (pid: 31206). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-31206.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3>' does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-06 00:43:25,260 35de572c MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-06 00:43:25,261 35de572c MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-06 00:43:25,263 35de572c MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-06 00:43:25,263 35de572c MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-06 00:43:25,264 35de572c MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-06 00:43:25,264 35de572c MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/35de572c/pkb.log>
2019-05-06 00:43:25,264 35de572c MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/35de572c/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
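The traceback repeated in these builds also shows why every Gradle problem is reported identically: SubmitJob in gcp_dpb_dataflow.py asserts on the Gradle return code, so a configuration error such as the missing project directory surfaces with the same message as a genuine test failure. A stripped-down reproduction of that pattern, for illustration only and not the actual PerfKitBenchmarker source:

    # Mirrors the assert shown at gcp_dpb_dataflow.py line 90 in the traceback.
    import subprocess

    def submit_job(cmd):
        retcode = subprocess.call(cmd)
        # Any non-zero exit, including "project directory does not exist",
        # raises AssertionError: Integration Test Failed.
        assert retcode == 0, "Integration Test Failed."
        return retcode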


Build failed in Jenkins: beam_PerformanceTests_Python35 #46

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/46/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 98453a0da534be0bc632bfd017377062a74d59de (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 98453a0da534be0bc632bfd017377062a74d59de
Commit message: "Merge pull request #8493: [BEAM-6966] Spark portable runner: translate READ"
 > git rev-list --no-walk 98453a0da534be0bc632bfd017377062a74d59de # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins8275588876214224732.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2564377974812480771.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins8271485300690001846.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins5752972783151705490.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2895495003968114885.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins6240199725858947097.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3527218257073322995.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-05 18:43:22,762 1739fecd MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/1739fecd/pkb.log>
2019-05-05 18:43:22,762 1739fecd MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-05 18:43:22,763 1739fecd MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
2019-05-05 18:43:22,991 1739fecd MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-05 18:43:23,013 1739fecd MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-05 18:43:23,035 1739fecd MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-05 18:43:23,038 1739fecd MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-05 18:43:23,038 1739fecd MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-05 18:43:23,049 1739fecd MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-05 18:43:25,058 1739fecd MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.889 secs.
The client will now receive all logging from the daemon (pid: 3432). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-3432.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3>' does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-05 18:43:25,059 1739fecd MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-05 18:43:25,059 1739fecd MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-05 18:43:25,061 1739fecd MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-05 18:43:25,061 1739fecd MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-05 18:43:25,062 1739fecd MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-05 18:43:25,062 1739fecd MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/1739fecd/pkb.log>
2019-05-05 18:43:25,062 1739fecd MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/1739fecd/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Python35 #45

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/45/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 98453a0da534be0bc632bfd017377062a74d59de (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 98453a0da534be0bc632bfd017377062a74d59de
Commit message: "Merge pull request #8493: [BEAM-6966] Spark portable runner: translate READ"
 > git rev-list --no-walk 98453a0da534be0bc632bfd017377062a74d59de # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins1305394030321297096.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2822027743123131933.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins6283632485440179878.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins7542616071710895427.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2862986645583218187.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4306745342252041705.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4346701123787884046.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-05 12:43:21,814 08149ab3 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/08149ab3/pkb.log>
2019-05-05 12:43:21,815 08149ab3 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-05 12:43:21,816 08149ab3 MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
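
The flag values above are what PerfKitBenchmarker assembles into the Gradle invocation logged a few lines below: --beam_location supplies the working tree (the workspace's src directory), --beam_it_module becomes Gradle's -p project path, --beam_it_class becomes -Dtests=..., and --beam_it_args together with --beam_runner and --beam_python_sdk_location are folded into -DpipelineOptions=..., with --beam_prebuilt indicating the SDK tarball should be used as-is. A rough sketch of that mapping, using a hypothetical helper name and simplified quoting rather than the actual PKB code:

    # Sketch only: how the flags plausibly compose into the ./gradlew command
    # shown in the "Running:" line below; not the real implementation.
    def build_gradle_command(flags):
        pipeline_opts = flags["beam_it_args"].split(",") + [
            "--runner=" + flags["beam_runner"],
            "--sdk_location=" + flags["beam_python_sdk_location"],
        ]
        return [
            flags["beam_location"] + "/gradlew", "integrationTest",
            "-Dtests=" + flags["beam_it_class"],
            "-p", flags["beam_it_module"],
            "-Dattr=IT",
            "-DpipelineOptions=" + " ".join(pipeline_opts),
            "--info", "--scan",
        ]
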
2019-05-05 12:43:22,089 08149ab3 MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-05 12:43:22,110 08149ab3 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-05 12:43:22,134 08149ab3 MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-05 12:43:22,136 08149ab3 MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-05 12:43:22,136 08149ab3 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-05 12:43:22,146 08149ab3 MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-05 12:43:24,150 08149ab3 MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.907 secs.
The client will now receive all logging from the daemon (pid: 2850). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-2850.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3'> does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-05 12:43:24,151 08149ab3 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-05 12:43:24,152 08149ab3 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-05 12:43:24,153 08149ab3 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-05 12:43:24,154 08149ab3 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-05 12:43:24,154 08149ab3 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-05 12:43:24,154 08149ab3 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/08149ab3/pkb.log>
2019-05-05 12:43:24,154 08149ab3 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/08149ab3/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Python35 #44

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/44/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 98453a0da534be0bc632bfd017377062a74d59de (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 98453a0da534be0bc632bfd017377062a74d59de
Commit message: "Merge pull request #8493: [BEAM-6966] Spark portable runner: translate READ"
 > git rev-list --no-walk 98453a0da534be0bc632bfd017377062a74d59de # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2241179979502127355.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins6593095417958988548.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2151707095846797663.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4563000526225807852.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins5846786818464698207.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins9121452770945458489.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2678730896947018755.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-05 06:43:21,990 b64815b0 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/b64815b0/pkb.log>
2019-05-05 06:43:21,991 b64815b0 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-05 06:43:21,992 b64815b0 MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
2019-05-05 06:43:22,452 b64815b0 MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-05 06:43:22,474 b64815b0 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-05 06:43:22,496 b64815b0 MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-05 06:43:22,498 b64815b0 MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-05 06:43:22,498 b64815b0 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-05 06:43:22,508 b64815b0 MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-05 06:43:24,463 b64815b0 MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.872 secs.
The client will now receive all logging from the daemon (pid: 20827). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-20827.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3'> does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-05 06:43:24,464 b64815b0 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-05 06:43:24,465 b64815b0 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-05 06:43:24,466 b64815b0 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-05 06:43:24,467 b64815b0 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-05 06:43:24,467 b64815b0 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-05 06:43:24,467 b64815b0 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/b64815b0/pkb.log>
2019-05-05 06:43:24,467 b64815b0 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/b64815b0/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Python35 #43

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/43/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 98453a0da534be0bc632bfd017377062a74d59de (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 98453a0da534be0bc632bfd017377062a74d59de
Commit message: "Merge pull request #8493: [BEAM-6966] Spark portable runner: translate READ"
 > git rev-list --no-walk 98453a0da534be0bc632bfd017377062a74d59de # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3912404117701215270.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins745354470162092020.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2301887445842801335.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins7949622411658011716.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins1780366206268134741.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2567405695460454409.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins1569114373109690748.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-05 00:43:22,706 4fd3e53a MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/4fd3e53a/pkb.log>
2019-05-05 00:43:22,707 4fd3e53a MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-05 00:43:22,708 4fd3e53a MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
2019-05-05 00:43:23,022 4fd3e53a MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-05 00:43:23,044 4fd3e53a MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-05 00:43:23,065 4fd3e53a MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-05 00:43:23,068 4fd3e53a MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-05 00:43:23,068 4fd3e53a MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-05 00:43:23,078 4fd3e53a MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-05 00:43:25,104 4fd3e53a MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.915 secs.
The client will now receive all logging from the daemon (pid: 29204). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-29204.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3>' does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-05 00:43:25,105 4fd3e53a MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-05 00:43:25,106 4fd3e53a MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-05 00:43:25,108 4fd3e53a MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-05 00:43:25,109 4fd3e53a MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-05 00:43:25,109 4fd3e53a MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-05 00:43:25,109 4fd3e53a MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/4fd3e53a/pkb.log>
2019-05-05 00:43:25,109 4fd3e53a MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/4fd3e53a/completion_statuses.json>
Build step 'Execute shell' marked build as failure
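
Note on the failure above: this is a path problem rather than a test failure. gradlew is invoked with -p sdks/python/test-suites/dataflow/py3, and that directory does not exist in the checked-out src tree, so the build aborts before any WordCountIT code runs. A minimal pre-flight check in the spirit of the job's wrapper scripts could surface the missing module before gradlew is launched; the sketch below is illustrative only, and the function name, workspace argument, and error text are assumptions, not part of PerfKitBenchmarker or the Jenkins job.

import os
import sys


def require_gradle_module(beam_src, module):
    """Exit early if the Gradle project directory passed via -p is missing."""
    path = os.path.join(beam_src, module)
    if os.path.isdir(path):
        return
    parent = os.path.dirname(path)
    available = sorted(os.listdir(parent)) if os.path.isdir(parent) else []
    sys.exit("Gradle module %r not found under %r; suites present: %s"
             % (module, beam_src, ", ".join(available) or "none"))


if __name__ == "__main__":
    # Hypothetical usage mirroring the flags logged above (paths assumed):
    #   python require_gradle_module.py <workspace>/src
    require_gradle_module(sys.argv[1], "sdks/python/test-suites/dataflow/py3")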



Build failed in Jenkins: beam_PerformanceTests_Python35 #42

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/42/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 98453a0da534be0bc632bfd017377062a74d59de (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 98453a0da534be0bc632bfd017377062a74d59de
Commit message: "Merge pull request #8493: [BEAM-6966] Spark portable runner: translate READ"
 > git rev-list --no-walk 98453a0da534be0bc632bfd017377062a74d59de # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins1016598105550809892.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4134421079906705301.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3641175590253304080.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3418527104753782790.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3618614344373959940.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins1576670702549675324.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins5832504566380837440.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-04 18:43:23,107 660923e7 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/660923e7/pkb.log>
2019-05-04 18:43:23,108 660923e7 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-04 18:43:23,110 660923e7 MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
2019-05-04 18:43:23,523 660923e7 MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-04 18:43:23,545 660923e7 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-04 18:43:23,567 660923e7 MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-04 18:43:23,569 660923e7 MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-04 18:43:23,570 660923e7 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-04 18:43:23,580 660923e7 MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-04 18:43:25,545 660923e7 MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon (subsequent builds will be faster)
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.883 secs.
The client will now receive all logging from the daemon (pid: 30114). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-30114.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3>' does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-04 18:43:25,546 660923e7 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-04 18:43:25,547 660923e7 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-04 18:43:25,549 660923e7 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-04 18:43:25,549 660923e7 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-04 18:43:25,549 660923e7 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-04 18:43:25,549 660923e7 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/660923e7/pkb.log>
2019-05-04 18:43:25,549 660923e7 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/660923e7/completion_statuses.json>
Build step 'Execute shell' marked build as failure
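
Note on the recurring traceback: every run dies at the same place. gcp_dpb_dataflow.py (line 90 in the trace) submits the job by shelling out to gradlew and then asserts on the exit code, so any non-zero return, including the missing-directory error above, is reported only as the generic "Integration Test Failed." message, with the real cause buried in the captured STDOUT/STDERR. A stripped-down sketch of that control flow is shown below; apart from the asserted message, which appears verbatim in the traceback, the function and variable names are assumptions rather than PerfKitBenchmarker's actual code.

import subprocess


def submit_gradle_job(cmd):
    """Run the gradlew command line and fail the benchmark on a non-zero exit."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    stdout, stderr = proc.communicate()
    retcode = proc.returncode
    # Mirrors the check visible in the traceback (gcp_dpb_dataflow.py, line 90).
    assert retcode == 0, "Integration Test Failed."
    return stdout, stderr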



Build failed in Jenkins: beam_PerformanceTests_Python35 #41

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/41/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-6966] Spark portable runner: translate READ

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 98453a0da534be0bc632bfd017377062a74d59de (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 98453a0da534be0bc632bfd017377062a74d59de
Commit message: "Merge pull request #8493: [BEAM-6966] Spark portable runner: translate READ"
 > git rev-list --no-walk c01b0121b23ea19640ba3d8169f21d76de66440d # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins8758864106463442666.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins7822280502459903921.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins1771751155649475805.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins9048289311599085106.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins8570396789499748392.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins5655810665093925954.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4839386688886017434.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-04 12:43:28,037 149d6798 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/149d6798/pkb.log>
2019-05-04 12:43:28,038 149d6798 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-04 12:43:28,039 149d6798 MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
2019-05-04 12:43:28,377 149d6798 MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-04 12:43:28,398 149d6798 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-04 12:43:28,419 149d6798 MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-04 12:43:28,421 149d6798 MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-04 12:43:28,421 149d6798 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-04 12:43:28,429 149d6798 MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-04 12:43:30,563 149d6798 MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.855 secs.
The client will now receive all logging from the daemon (pid: 26991). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-26991.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3>' does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-04 12:43:30,564 149d6798 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
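
The traceback ends at the return-code assertion quoted above (gcp_dpb_dataflow.py, line 90). A minimal sketch of that pattern, assuming the Gradle invocation is run as a subprocess whose exit status is then checked, mirroring the "Running: ..." / "Ran: ... ReturnCode:1" pair logged earlier (hypothetical names; the real helper lives in PerfKitBenchmarker):

    import subprocess

    def submit_job(cmd):
        # Run the Gradle command and capture its exit status.
        proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        stdout, stderr = proc.communicate()
        retcode = proc.returncode
        # The assertion that produces "AssertionError: Integration Test Failed."
        assert retcode == 0, "Integration Test Failed."
        return stdout
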
2019-05-04 12:43:30,565 149d6798 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-04 12:43:30,566 149d6798 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-04 12:43:30,567 149d6798 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-04 12:43:30,567 149d6798 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-04 12:43:30,567 149d6798 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/149d6798/pkb.log>
2019-05-04 12:43:30,567 149d6798 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/149d6798/completion_statuses.json>
Build step 'Execute shell' marked build as failure
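
The console output above (and the #39 and #40 logs that follow) reduces to the same symptom: gradlew is invoked with -p sdks/python/test-suites/dataflow/py3 under the <ws>/src checkout, Gradle reports that this project directory does not exist, the build dies in about a second, and PKB's SubmitJob assertion turns the non-zero return code into "Integration Test Failed." A small, hypothetical pre-flight check of the kind one could run before calling gradlew (illustrative only; not part of PKB or the Jenkins job, and the workspace path below is a placeholder):

    import os
    import sys

    def check_it_module(beam_location, it_module):
        # Fail fast with a pointed message when the Gradle project path is missing,
        # instead of waiting for "The specified project directory ... does not exist".
        module_dir = os.path.join(beam_location, it_module)
        if not os.path.isdir(module_dir):
            sys.exit("Beam IT module not found: %s "
                     "(check --beam_location and --beam_it_module)" % module_dir)

    check_it_module("/path/to/ws/src", "sdks/python/test-suites/dataflow/py3")
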



Build failed in Jenkins: beam_PerformanceTests_Python35 #40

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/40/display/redirect?page=changes>

Changes:

[yoshiki.obata] [BEAM-7137] encode header to bytes when writing to file at

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c01b0121b23ea19640ba3d8169f21d76de66440d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c01b0121b23ea19640ba3d8169f21d76de66440d
Commit message: "Merge pull request #8452 from lazylynx/writetotext-header-encode"
 > git rev-list --no-walk 6992fd0ad7de1241c6b1c08b4c7568e560e0b8f3 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins6974045520712284816.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4160121323888621977.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins4263799418485742442.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins7821516798289722360.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3377460065745891471.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins6760542394105754652.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3253757744505025539.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-04 06:43:24,321 6ad0acc7 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/6ad0acc7/pkb.log>
2019-05-04 06:43:24,333 6ad0acc7 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-04 06:43:24,335 6ad0acc7 MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
2019-05-04 06:43:24,645 6ad0acc7 MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-04 06:43:24,668 6ad0acc7 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-04 06:43:24,690 6ad0acc7 MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-04 06:43:24,693 6ad0acc7 MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-04 06:43:24,693 6ad0acc7 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-04 06:43:24,703 6ad0acc7 MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-04 06:43:26,725 6ad0acc7 MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.931 secs.
The client will now receive all logging from the daemon (pid: 20448). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-20448.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3>' does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-04 06:43:26,726 6ad0acc7 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-04 06:43:26,727 6ad0acc7 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-04 06:43:26,729 6ad0acc7 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-04 06:43:26,729 6ad0acc7 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-04 06:43:26,730 6ad0acc7 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-04 06:43:26,730 6ad0acc7 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/6ad0acc7/pkb.log>
2019-05-04 06:43:26,730 6ad0acc7 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/6ad0acc7/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Python35 #39

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Python35/39/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-6605] Deprecate TextIO.readAll and TextIO.ReadAll transform

[iemejia] [BEAM-6605] Refactor TextIO.Read and its Tests to use FileIO + ReadFiles

[iemejia] [BEAM-6606] Deprecate AvroIO ReadAll and ParseAll transforms

[pabloem] [BEAM-2939] Initial SyntheticSDF as Source and add an Synthetic pipeline

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6992fd0ad7de1241c6b1c08b4c7568e560e0b8f3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6992fd0ad7de1241c6b1c08b4c7568e560e0b8f3
Commit message: "[BEAM-2939] Initial SyntheticSDF as Source and add an Synthetic pipeline to sdf test (#8338)"
 > git rev-list --no-walk 896402425b36b54e2cf00249fec2d84ec86c7e63 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins6216535490637520959.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins5591520803966156576.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env>
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3265110914763055821.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins6137627012633725862.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1)
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins3191549047283819325.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins2964728242512288152.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/01/11/525b02e4acc0c747de8b6ccdab376331597c569c42ea66ab0a1dbd36eca2/urllib3-1.24.3-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.3 xmltodict-0.12.0
[beam_PerformanceTests_Python35] $ /bin/bash -xe /tmp/jenkins5703704538306301413.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.wordcount_py35_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src> --official=true --beam_sdk=python --benchmarks=beam_integration_benchmark --beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it --beam_it_module=sdks/python/test-suites/dataflow/py3 --beam_prebuilt=true --beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz --beam_runner=TestDataflowRunner --beam_it_timeout=1200 --beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
2019-05-04 00:43:25,563 d5aa031f MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/d5aa031f/pkb.log>
2019-05-04 00:43:25,564 d5aa031f MainThread INFO     PerfKitBenchmarker version: v1.12.0-1162-g62923c5
2019-05-04 00:43:25,565 d5aa031f MainThread INFO     Flag values:
--beam_it_class=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it
--beam_it_timeout=1200
--beam_it_module=sdks/python/test-suites/dataflow/py3
--beam_sdk=python
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.wordcount_py35_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/>
--beam_it_args=--project=apache-beam-testing,--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it,--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it,--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
--beam_prebuilt
--project=apache-beam-testing
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src>
--beam_runner=TestDataflowRunner
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
--beam_python_sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz
2019-05-04 00:43:25,749 d5aa031f MainThread INFO     Setting --max_concurrent_threads=200.
2019-05-04 00:43:25,770 d5aa031f MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-05-04 00:43:25,792 d5aa031f MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-05-04 00:43:25,794 d5aa031f MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-05-04 00:43:25,794 d5aa031f MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-05-04 00:43:25,805 d5aa031f MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan
2019-05-04 00:43:27,834 d5aa031f MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/gradlew> integrationTest -Dtests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it -p sdks/python/test-suites/dataflow/py3 -Dattr=IT -DpipelineOptions="--project=apache-beam-testing" "--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it" "--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it" "--output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output" "--runner=TestDataflowRunner" "--sdk_location=test-suites/dataflow/py3/build/apache-beam.tar.gz" --info --scan}  ReturnCode:1
STDOUT: Initialized native services in: /home/jenkins/.gradle/native
Removing 0 daemon stop events from registry
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
Starting process 'Gradle build daemon'. Working directory: /home/jenkins/.gradle/daemon/5.2.1 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -Xss10240k -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -cp /home/jenkins/.gradle/wrapper/dists/gradle-5.2.1-all/bviwmvmbexq6idcscbicws5me/gradle-5.2.1/lib/gradle-launcher-5.2.1.jar org.gradle.launcher.daemon.bootstrap.GradleDaemon 5.2.1
Successfully started process 'Gradle build daemon'
An attempt to start the daemon took 0.897 secs.
The client will now receive all logging from the daemon (pid: 31655). The daemon log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-31655.out.log
Starting build in new daemon [memory: 24.4 GB]

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
The specified project directory '<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/src/sdks/python/test-suites/dataflow/py3>' does not exist.

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s

2019-05-04 00:43:27,835 d5aa031f MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-04 00:43:27,836 d5aa031f MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-04 00:43:27,838 d5aa031f MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-04 00:43:27,838 d5aa031f MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-04 00:43:27,839 d5aa031f MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-04 00:43:27,839 d5aa031f MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/d5aa031f/pkb.log>
2019-05-04 00:43:27,839 d5aa031f MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Python35/ws/runs/d5aa031f/completion_statuses.json>
Build step 'Execute shell' marked build as failure
