Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/29 12:19:19 UTC

Build failed in Jenkins: beam_PerformanceTests_Spark #3340

See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3340/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-6692] portable Spark: reshuffle translation

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f22070bc224d75dadb553c7dce4ab5f08120978b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f22070bc224d75dadb553c7dce4ab5f08120978b
Commit message: "Merge pull request #8966: [BEAM-6692] portable Spark: reshuffle translation"
 > git rev-list --no-walk 30fa5d8dc89f9ea7b638c573fe076d41898d6d23 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6139304570578790116.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6529662627033800789.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5137678052860659457.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins398402578712064856.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2956667107972752240.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins941860627574464461.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2056711656100495148.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-06-29 12:19:15,985 59675e86 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/59675e86/pkb.log>
2019-06-29 12:19:15,986 59675e86 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-06-29 12:19:15,987 59675e86 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-06-29 12:19:16,203 59675e86 MainThread INFO     Setting --max_concurrent_threads=200.
2019-06-29 12:19:16,227 59675e86 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-06-29 12:19:16,252 59675e86 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-06-29 12:19:16,255 59675e86 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-06-29 12:19:16,256 59675e86 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-59675e86 --format json --quiet --project apache-beam-testing
2019-06-29 12:19:17,766 59675e86 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-59675e86 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-59675e86

2019-06-29 12:19:17,767 59675e86 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-59675e86 --format json --quiet --project apache-beam-testing
2019-06-29 12:19:18,392 59675e86 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-59675e86 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-59675e86

2019-06-29 12:19:18,395 59675e86 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-06-29 12:19:18,395 59675e86 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-06-29 12:19:18,395 59675e86 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-06-29 12:19:18,396 59675e86 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/59675e86/pkb.log>
2019-06-29 12:19:18,396 59675e86 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/59675e86/completion_statuses.json>
Build step 'Execute shell' marked build as failure
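For context on the failure above: the provisioning step dies on KeyError: 'nodisk' because the spec's disk type is used as a key into a provider-side lookup table that has no such entry. The snippet below is a minimal sketch of that failure mode only; the table contents and function name are hypothetical illustrations, not PerfKitBenchmarker's actual code.

```python
# Sketch of the KeyError seen in the traceback: a disk-type string that
# is absent from a provider's lookup table raises KeyError on indexing.
# DISK_TYPE_MAP and resolve_disk_type are hypothetical names.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}


def resolve_disk_type(disk_type):
    """Translate a spec disk type, failing with a readable error."""
    try:
        # Bare indexing, as in the traceback, would raise KeyError here.
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'unsupported disk type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))
```

Wrapping the lookup turns the opaque KeyError: 'nodisk' into a message that names the bad value and the supported choices.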



Build failed in Jenkins: beam_PerformanceTests_Spark #3436

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3436/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-7502] Create ParDo Python Load Test Jenkins job

[kamil.wasilewski] [BEAM-7502] Renamed file with Python GBK Load Test job definition

[kamil.wasilewski] [BEAM-7502] Reduced number of iterations to 1 in Java ParDo job

[robertwb] Temporary workaround for [BEAM-7473] (#9023)

[robertwb] Make SDFBoundedSource wrapper work with dynamic splitting (#8944)

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 75875def7098dec8fcab89941ee398a34fbf5fb1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 75875def7098dec8fcab89941ee398a34fbf5fb1
Commit message: "Make SDFBoundedSource wrapper work with dynamic splitting (#8944)"
 > git rev-list --no-walk 8bd3b50715029341e21577ab086301b415c844b3 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7816508980521095289.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3304481152223196546.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8536260703965235402.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins75147481523039180.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.2)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9001540459804095277.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7583100413135822610.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6700992760232883728.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-23 13:08:34,954 2bd5db40 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2bd5db40/pkb.log>
2019-07-23 13:08:34,954 2bd5db40 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1281-g37855a6
2019-07-23 13:08:34,956 2bd5db40 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-23 13:08:35,146 2bd5db40 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-23 13:08:35,172 2bd5db40 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-23 13:08:35,197 2bd5db40 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-23 13:08:35,200 2bd5db40 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-23 13:08:35,201 2bd5db40 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-2bd5db40 --format json --quiet --project apache-beam-testing
2019-07-23 13:08:36,786 2bd5db40 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-2bd5db40 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-2bd5db40

2019-07-23 13:08:36,787 2bd5db40 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-2bd5db40 --format json --quiet --project apache-beam-testing
2019-07-23 13:08:37,399 2bd5db40 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-2bd5db40 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-2bd5db40

2019-07-23 13:08:37,401 2bd5db40 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-23 13:08:37,402 2bd5db40 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-23 13:08:37,402 2bd5db40 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-23 13:08:37,402 2bd5db40 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2bd5db40/pkb.log>
2019-07-23 13:08:37,403 2bd5db40 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2bd5db40/completion_statuses.json>
Build step 'Execute shell' marked build as failure
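Both tracebacks above end in the same place: a dict lookup in the Dataproc provider's disk-type table, subscripted with a disk type (`'nodisk'`) the table does not contain. A minimal sketch of that failure pattern follows; `DISK_TYPE_MAP`, its contents, and `resolve_disk_type` are hypothetical names used only to illustrate the mechanism, not PerfKitBenchmarker's actual code.

```python
# Hypothetical stand-in for the disk-type mapping in gcp_dpb_dataproc.py.
# Subscripting a dict with a missing key raises KeyError, which is
# exactly what the log shows for the spec value 'nodisk'.
DISK_TYPE_MAP = {
    "pd-standard": "pd-standard",
    "pd-ssd": "pd-ssd",
}

def resolve_disk_type(disk_type):
    """Map a benchmark spec disk type to a provider disk type."""
    try:
        # A bare DISK_TYPE_MAP[disk_type] here reproduces KeyError: 'nodisk'.
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        # A defensive variant fails with an actionable message instead of
        # a raw KeyError deep inside the provisioning phase.
        raise ValueError(
            "Unsupported Dataproc disk type %r; expected one of %s"
            % (disk_type, sorted(DISK_TYPE_MAP)))
```

Under this assumption, the fix would be either to supply a disk type the provider supports or to teach the mapping about the `nodisk` case before `_Create` runs.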

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3435

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3435/display/redirect?page=changes>

Changes:

[github] Revert "[BEAM-7060] Migrate to native typing types where possible."

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-3 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8bd3b50715029341e21577ab086301b415c844b3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8bd3b50715029341e21577ab086301b415c844b3
Commit message: "Merge pull request #9122: [BEAM-7798] Revert "[BEAM-7060] Migrate to native typing types where possible.""
 > git rev-list --no-walk 6518abfb3ea47a4802d76ca3c405c3f66e48eaa2 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6852953997964779134.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8744437162970092364.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8011027225139426616.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2006749027883596462.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.2)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7646063278607208206.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1744263209318418073.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7226007982431901378.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-23 06:52:17,533 0f72a25e MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0f72a25e/pkb.log>
2019-07-23 06:52:17,533 0f72a25e MainThread INFO     PerfKitBenchmarker version: v1.12.0-1281-g37855a6
2019-07-23 06:52:17,534 0f72a25e MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-23 06:52:17,783 0f72a25e MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-23 06:52:17,807 0f72a25e MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-23 06:52:17,830 0f72a25e MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-23 06:52:17,833 0f72a25e MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-23 06:52:17,834 0f72a25e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-0f72a25e --format json --quiet --project apache-beam-testing
2019-07-23 06:52:19,132 0f72a25e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-0f72a25e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-0f72a25e

2019-07-23 06:52:19,133 0f72a25e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-0f72a25e --format json --quiet --project apache-beam-testing
2019-07-23 06:52:19,700 0f72a25e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-0f72a25e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-0f72a25e

2019-07-23 06:52:19,702 0f72a25e MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-23 06:52:19,703 0f72a25e MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-23 06:52:19,703 0f72a25e MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-23 06:52:19,703 0f72a25e MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0f72a25e/pkb.log>
2019-07-23 06:52:19,703 0f72a25e MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0f72a25e/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3434

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3434/display/redirect?page=changes>

Changes:

[github] Update Python 3 entry in Python SDK roadmap.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6518abfb3ea47a4802d76ca3c405c3f66e48eaa2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6518abfb3ea47a4802d76ca3c405c3f66e48eaa2
Commit message: "Merge pull request #9123 from tvalentyn/patch-57"
 > git rev-list --no-walk ab80cc5f031f7881b688e7fb5b05191fa6b3f80f # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4259186908169704807.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3279836306694721285.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1997446001854936900.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9212070207014343603.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins20212359493051442.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins561240374081327264.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5838181423510418338.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-23 00:48:20,002 71a6ac69 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/71a6ac69/pkb.log>
2019-07-23 00:48:20,002 71a6ac69 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1281-g37855a6
2019-07-23 00:48:20,004 71a6ac69 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-23 00:48:20,397 71a6ac69 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-23 00:48:20,422 71a6ac69 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
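[Editorial note: the WARNING above reflects how --config_override entries are merged into the benchmark's default config — keys present in the user overrides but absent from the defaults are flagged as possible typos. The sketch below is a rough, hypothetical illustration of that merge, not PerfKitBenchmarker's real implementation.]

```python
def merge_overrides(defaults, overrides):
    """Merge user overrides into a default config, warning on unknown keys.

    Hypothetical sketch of the behavior behind the WARNING above; not
    PerfKitBenchmarker's actual merge code.
    """
    merged = dict(defaults)
    warnings = []
    for key, value in overrides.items():
        if key not in defaults:
            # Unknown key: accept it, but record a typo warning like the log's.
            warnings.append(
                'The key "%s" was not in the default config, but was in '
                'user overrides. This may indicate a typo.' % key)
        merged[key] = value
    return merged, warnings
```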
2019-07-23 00:48:20,444 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-23 00:48:20,446 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-23 00:48:20,447 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-71a6ac69 --format json --quiet --project apache-beam-testing
2019-07-23 00:48:21,751 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-71a6ac69 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-71a6ac69

2019-07-23 00:48:21,752 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-71a6ac69 --format json --quiet --project apache-beam-testing
2019-07-23 00:48:22,345 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-71a6ac69 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-71a6ac69

2019-07-23 00:48:22,348 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-23 00:48:22,349 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-23 00:48:22,349 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-23 00:48:22,349 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/71a6ac69/pkb.log>
2019-07-23 00:48:22,349 71a6ac69 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/71a6ac69/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3433

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3433/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-7679] Add randomness to ITs' BQ dataset name

[valentyn] Default to PiplelineState.UNKNOWN when job state returned from v1beta3

[valentyn] fixup: Address review feedback.

[alireza4263] [BEAM-7783] Adding BeamTableStatistics.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ab80cc5f031f7881b688e7fb5b05191fa6b3f80f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ab80cc5f031f7881b688e7fb5b05191fa6b3f80f
Commit message: "Merge pull request #9094 from tvalentyn/default_to_unknown"
 > git rev-list --no-walk f6367cc1a4565060e5f1bf652ad5725ac5b43aa6 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1730438132268903034.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1376792054865780927.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1243499390997047872.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6647948917388980385.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7912881136534657738.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1119807227650320769.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4628501489989371220.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-22 18:45:49,616 614801da MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/614801da/pkb.log>
2019-07-22 18:45:49,617 614801da MainThread INFO     PerfKitBenchmarker version: v1.12.0-1281-g37855a6
2019-07-22 18:45:49,618 614801da MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-22 18:45:49,814 614801da MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-22 18:45:49,839 614801da MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-22 18:45:49,860 614801da MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-22 18:45:49,863 614801da MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
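The KeyError above is a plain dictionary lookup miss: the benchmark's disk spec carries the type 'nodisk', which has no entry in the Dataproc provider's disk-type table. A minimal hedged sketch of the failure mode and a defensive alternative (the table name and entries are hypothetical, not PerfKitBenchmarker's actual code):

```python
# Hypothetical sketch: a disk-type translation table indexed with a
# key ('nodisk') that was never registered raises KeyError, exactly
# as in the traceback above.
DISK_TYPE_MAP = {'pd-standard': 'pd-standard', 'pd-ssd': 'pd-ssd'}

def translate(disk_type):
    # Direct indexing (DISK_TYPE_MAP[disk_type]) reproduces the
    # KeyError; dict.get makes the lookup defensive instead.
    return DISK_TYPE_MAP.get(disk_type)

assert translate('pd-ssd') == 'pd-ssd'
assert translate('nodisk') is None  # no crash; caller can report the bad spec
```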
2019-07-22 18:45:49,864 614801da MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-614801da --format json --quiet --project apache-beam-testing
2019-07-22 18:45:50,415 614801da MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-614801da --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-614801da

2019-07-22 18:45:50,416 614801da MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-614801da --format json --quiet --project apache-beam-testing
2019-07-22 18:45:50,954 614801da MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-614801da --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-614801da
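The NOT_FOUND errors from the two gcloud cleanup commands are expected here: provisioning crashed before any cluster was created, and PKB's teardown is best-effort. A small sketch of that tolerate-NOT_FOUND cleanup pattern (function names are illustrative, not PKB's API):

```python
# Illustrative sketch: treat a NOT_FOUND failure from the delete
# command as "already clean" so a crashed run can still complete its
# teardown.  delete_cmd stands in for invoking gcloud; it returns
# (returncode, stderr) like the log lines above.
def cleanup(delete_cmd):
    returncode, stderr = delete_cmd()
    if returncode != 0 and 'NOT_FOUND' in stderr:
        return 'already-absent'   # cluster never existed; not fatal
    return 'deleted' if returncode == 0 else 'error'

# Simulate the log output: ReturnCode:1 with a NOT_FOUND message.
fake_gcloud = lambda: (1, 'ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND')
assert cleanup(fake_gcloud) == 'already-absent'
```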

2019-07-22 18:45:50,957 614801da MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-22 18:45:50,957 614801da MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-22 18:45:50,958 614801da MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-22 18:45:50,958 614801da MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/614801da/pkb.log>
2019-07-22 18:45:50,958 614801da MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/614801da/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3432

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3432/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-7060] Support translation of native type variables.

[robertwb] [BEAM-7060] Translate the typing.Iterable type hint.

[robertwb] Minor fixes discovered in migration.

[robertwb] [BEAM-7060] Automated replace of typehints with typing.

[github] Remove 4 empty spaces from PR template that mess up Python postcommit

[github] Relax pydot requirements.

[ttanay100] Replace old badges for Python PostCommit test with split ones

[robertwb] Post-rewrite lint fixes.

[robertwb] Manual fixes for over-agressive replace.

[robertwb] More conservative typing module translation.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f6367cc1a4565060e5f1bf652ad5725ac5b43aa6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f6367cc1a4565060e5f1bf652ad5725ac5b43aa6
Commit message: "Merge pull request #9112 Fix PR template."
 > git rev-list --no-walk 94a3c8542d61de52aca112d2373c399bc8826611 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4293376791346619439.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1606622952139878078.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1413672346415840002.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6984975938212296604.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3029791422350107178.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1888747577340384511.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins450012203621938708.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-22 12:54:38,900 fb6390be MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/fb6390be/pkb.log>
2019-07-22 12:54:38,901 fb6390be MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-22 12:54:38,902 fb6390be MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-22 12:54:39,145 fb6390be MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-22 12:54:39,170 fb6390be MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-22 12:54:39,191 fb6390be MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-22 12:54:39,193 fb6390be MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-22 12:54:39,194 fb6390be MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-fb6390be --format json --quiet --project apache-beam-testing
2019-07-22 12:54:40,481 fb6390be MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-fb6390be --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-fb6390be

2019-07-22 12:54:40,482 fb6390be MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-fb6390be --format json --quiet --project apache-beam-testing
2019-07-22 12:54:41,049 fb6390be MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-fb6390be --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-fb6390be

2019-07-22 12:54:41,052 fb6390be MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
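[Editor's note: the KeyError above is raised because `_Create` indexes a disk-type lookup dict with a configured disk type ('nodisk') that the map does not contain. A minimal sketch of the failure mode follows; the name `DISK_TYPE_MAP` and the helper functions are illustrative, not PerfKitBenchmarker's actual identifiers.]

```python
# Illustrative reproduction: direct dict indexing raises KeyError when
# the configured disk type is absent from the lookup map, exactly as in
# the traceback above.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Direct indexing fails hard on unknown keys.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default=None):
    # Defensive variant: dict.get maps unknown types to a default
    # instead of crashing the provisioning phase.
    return DISK_TYPE_MAP.get(disk_type, default)

try:
    resolve_disk_type('nodisk')
except KeyError as exc:
    print('KeyError:', exc)  # KeyError: 'nodisk'

print(resolve_disk_type_safe('nodisk'))  # None
```

[Guarding the lookup (or validating the disk spec before provisioning) would turn this hard crash into a clear configuration error.]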
2019-07-22 12:54:41,052 fb6390be MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-22 12:54:41,052 fb6390be MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-22 12:54:41,052 fb6390be MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/fb6390be/pkb.log>
2019-07-22 12:54:41,053 fb6390be MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/fb6390be/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3431

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3431/display/redirect?page=changes>

Changes:

[jeff] [BEAM-5191] Support for BigQuery clustering

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 94a3c8542d61de52aca112d2373c399bc8826611 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 94a3c8542d61de52aca112d2373c399bc8826611
Commit message: "Merge pull request #8945: [BEAM-5191] Support for BigQuery clustering"
 > git rev-list --no-walk 23a879f5044bc42a63ceb1394d59d9927bfde79a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2940464835542314584.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6999586087545279436.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7363945173911211610.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8604200429967634066.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1345571446566328046.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2919348776127194003.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins190972092589910117.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-22 06:48:35,650 a8904904 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a8904904/pkb.log>
2019-07-22 06:48:35,650 a8904904 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-22 06:48:35,652 a8904904 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-22 06:48:35,800 a8904904 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-22 06:48:35,826 a8904904 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-22 06:48:35,850 a8904904 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-22 06:48:35,853 a8904904 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-22 06:48:35,854 a8904904 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-a8904904 --format json --quiet --project apache-beam-testing
2019-07-22 06:48:37,279 a8904904 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-a8904904 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-a8904904

2019-07-22 06:48:37,280 a8904904 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-a8904904 --format json --quiet --project apache-beam-testing
2019-07-22 06:48:38,013 a8904904 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-a8904904 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-a8904904

2019-07-22 06:48:38,016 a8904904 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-22 06:48:38,017 a8904904 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-22 06:48:38,017 a8904904 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-22 06:48:38,017 a8904904 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a8904904/pkb.log>
2019-07-22 06:48:38,017 a8904904 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a8904904/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3430

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3430/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-8 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 23a879f5044bc42a63ceb1394d59d9927bfde79a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 23a879f5044bc42a63ceb1394d59d9927bfde79a
Commit message: "Merge pull request #9110: Revert incorrect doc on flink runner version"
 > git rev-list --no-walk 23a879f5044bc42a63ceb1394d59d9927bfde79a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1406523206245772703.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins56637496404703754.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins865124049235704921.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4635363221695075153.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5657970414909467312.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins707508495117111613.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2396956663803172974.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-22 00:49:06,977 8117c450 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/8117c450/pkb.log>
2019-07-22 00:49:06,978 8117c450 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-22 00:49:06,979 8117c450 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-22 00:49:07,322 8117c450 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-22 00:49:07,346 8117c450 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-22 00:49:07,369 8117c450 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-22 00:49:07,372 8117c450 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
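[Editor's note: the traceback above ends in a bare dict lookup, `...[self.spec.worker_group.disk_spec.disk_type]`, failing on the value 'nodisk'. The following is a minimal sketch of that failure mode; the mapping name, its contents, and the helper function are hypothetical, not PerfKitBenchmarker's actual code.]

```python
# Hypothetical mapping of benchmark disk_spec disk types to Dataproc
# disk names; the real table in gcp_dpb_dataproc.py evidently has no
# entry for 'nodisk', which is what raises KeyError during provisioning.
DATAPROC_DISK_TYPES = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    """Look up a Dataproc disk name, failing with a readable error.

    Direct indexing (DATAPROC_DISK_TYPES[disk_type]) reproduces the
    log's failure for an unsupported type; wrapping the lookup turns
    the bare KeyError into a message that names the valid choices.
    """
    try:
        return DATAPROC_DISK_TYPES[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported Dataproc disk type %r; expected one of %s'
            % (disk_type, sorted(DATAPROC_DISK_TYPES)))
```

With this shape, a config that sets disk_type to 'nodisk' would fail with an actionable message at provisioning time rather than an opaque KeyError.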
2019-07-22 00:49:07,373 8117c450 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-8117c450 --format json --quiet --project apache-beam-testing
2019-07-22 00:49:07,924 8117c450 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-8117c450 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-8117c450

2019-07-22 00:49:07,925 8117c450 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-8117c450 --format json --quiet --project apache-beam-testing
2019-07-22 00:49:08,419 8117c450 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-8117c450 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-8117c450

2019-07-22 00:49:08,421 8117c450 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-22 00:49:08,422 8117c450 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-22 00:49:08,422 8117c450 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-22 00:49:08,422 8117c450 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/8117c450/pkb.log>
2019-07-22 00:49:08,422 8117c450 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/8117c450/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3429

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3429/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 23a879f5044bc42a63ceb1394d59d9927bfde79a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 23a879f5044bc42a63ceb1394d59d9927bfde79a
Commit message: "Merge pull request #9110: Revert incorrect doc on flink runner version"
 > git rev-list --no-walk 23a879f5044bc42a63ceb1394d59d9927bfde79a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6722964519691206793.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4320220335619563998.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3121103530911013937.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3393608576610298059.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8344267181722012732.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5578671507543295191.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4047845318087150433.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-21 18:52:30,445 62f66cf1 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/62f66cf1/pkb.log>
2019-07-21 18:52:30,446 62f66cf1 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-21 18:52:30,447 62f66cf1 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-21 18:52:30,736 62f66cf1 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-21 18:52:30,761 62f66cf1 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-21 18:52:30,785 62f66cf1 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-21 18:52:30,787 62f66cf1 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-21 18:52:30,789 62f66cf1 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-62f66cf1 --format json --quiet --project apache-beam-testing
2019-07-21 18:52:31,955 62f66cf1 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-62f66cf1 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-62f66cf1

2019-07-21 18:52:31,956 62f66cf1 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-62f66cf1 --format json --quiet --project apache-beam-testing
2019-07-21 18:52:32,559 62f66cf1 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-62f66cf1 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-62f66cf1

2019-07-21 18:52:32,562 62f66cf1 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
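The `KeyError: 'nodisk'` above is raised by a direct dict lookup on the worker group's `disk_spec.disk_type` in `gcp_dpb_dataproc.py`: the benchmark spec supplies the disk type `'nodisk'`, which the provider's disk-type mapping does not contain. A minimal sketch of that failure mode (the mapping name `DISK_TYPE_FLAGS` and the flag strings are illustrative placeholders, not PerfKitBenchmarker's actual identifiers):

```python
# Illustrative sketch of the provisioning failure, assuming a mapping from
# PKB disk-type names to Dataproc cluster-create flags.
DISK_TYPE_FLAGS = {
    'pd-standard': '--worker-boot-disk-type=pd-standard',
    'pd-ssd': '--worker-boot-disk-type=pd-ssd',
}

def disk_flag(disk_type):
    # A plain DISK_TYPE_FLAGS[disk_type] reproduces the traceback's
    # KeyError: 'nodisk'. Catching it (or using .get) makes the missing
    # mapping explicit instead of aborting provisioning.
    try:
        return DISK_TYPE_FLAGS[disk_type]
    except KeyError:
        return None

assert disk_flag('pd-ssd') == '--worker-boot-disk-type=pd-ssd'
assert disk_flag('nodisk') is None
```

Either the benchmark config needs a disk type the provider knows about, or the provider mapping needs an entry (or fallback) for `'nodisk'`.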
2019-07-21 18:52:32,562 62f66cf1 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-21 18:52:32,563 62f66cf1 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-21 18:52:32,563 62f66cf1 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/62f66cf1/pkb.log>
2019-07-21 18:52:32,563 62f66cf1 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/62f66cf1/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3428

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3428/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-8 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 23a879f5044bc42a63ceb1394d59d9927bfde79a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 23a879f5044bc42a63ceb1394d59d9927bfde79a
Commit message: "Merge pull request #9110: Revert incorrect doc on flink runner version"
 > git rev-list --no-walk 23a879f5044bc42a63ceb1394d59d9927bfde79a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8142974945865736105.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1378397082173060700.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4835264076562186412.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7343911608268956553.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5541763940659404269.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8279671329965122736.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3760926725808998077.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-21 12:56:33,255 8a3ee075 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/8a3ee075/pkb.log>
2019-07-21 12:56:33,256 8a3ee075 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-21 12:56:33,257 8a3ee075 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-21 12:56:33,491 8a3ee075 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-21 12:56:33,516 8a3ee075 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-21 12:56:33,540 8a3ee075 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-21 12:56:33,543 8a3ee075 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-21 12:56:33,544 8a3ee075 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-8a3ee075 --format json --quiet --project apache-beam-testing
2019-07-21 12:56:34,137 8a3ee075 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-8a3ee075 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-8a3ee075

2019-07-21 12:56:34,138 8a3ee075 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-8a3ee075 --format json --quiet --project apache-beam-testing
2019-07-21 12:56:34,684 8a3ee075 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-8a3ee075 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-8a3ee075

2019-07-21 12:56:34,687 8a3ee075 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-21 12:56:34,687 8a3ee075 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-21 12:56:34,688 8a3ee075 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-21 12:56:34,688 8a3ee075 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/8a3ee075/pkb.log>
2019-07-21 12:56:34,688 8a3ee075 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/8a3ee075/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3427

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3427/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-8 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 23a879f5044bc42a63ceb1394d59d9927bfde79a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 23a879f5044bc42a63ceb1394d59d9927bfde79a
Commit message: "Merge pull request #9110: Revert incorrect doc on flink runner version"
 > git rev-list --no-walk 23a879f5044bc42a63ceb1394d59d9927bfde79a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3959063402332216103.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins281343852280671778.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4046651702556972859.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7652414165731718510.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2024947320752458308.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6847022000095367898.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3637317605579028128.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-21 06:49:24,575 7a7d09bb MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7a7d09bb/pkb.log>
2019-07-21 06:49:24,575 7a7d09bb MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-21 06:49:24,576 7a7d09bb MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-21 06:49:24,774 7a7d09bb MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-21 06:49:24,799 7a7d09bb MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-21 06:49:24,821 7a7d09bb MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-21 06:49:24,824 7a7d09bb MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
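
[Editor's note: the sketch below is a hypothetical minimal reproduction of the failure mode in the traceback above, not PerfKitBenchmarker's actual code. The crash happens because a disk-type lookup table in gcp_dpb_dataproc.py is indexed with a disk type ('nodisk') that has no entry, so Python raises KeyError during the provision phase. All names below are illustrative.]

```python
# Hypothetical mapping from PKB disk-spec types to Dataproc disk-type
# flag values. In the real module, 'nodisk' is missing from the table,
# which is what produces the KeyError seen in the log.
DISK_TYPE_TO_DATAPROC = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def dataproc_disk_flag(disk_type):
    """Resolve a disk type, failing with a clear message for unknown types.

    Direct indexing, DISK_TYPE_TO_DATAPROC[disk_type], reproduces the
    crash: an unmapped type raises a bare KeyError mid-provision. A
    guarded lookup surfaces the unsupported value explicitly instead.
    """
    try:
        return DISK_TYPE_TO_DATAPROC[disk_type]
    except KeyError:
        raise ValueError('unsupported disk_type: %r' % (disk_type,))
```

With this guard, the benchmark would still fail for 'nodisk', but with an actionable error naming the bad configuration value rather than an opaque KeyError from deep inside the provision phase.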
2019-07-21 06:49:24,825 7a7d09bb MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-7a7d09bb --format json --quiet --project apache-beam-testing
2019-07-21 06:49:25,391 7a7d09bb MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-7a7d09bb --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-7a7d09bb

2019-07-21 06:49:25,392 7a7d09bb MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-7a7d09bb --format json --quiet --project apache-beam-testing
2019-07-21 06:49:25,943 7a7d09bb MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-7a7d09bb --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-7a7d09bb

2019-07-21 06:49:25,945 7a7d09bb MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-21 06:49:25,946 7a7d09bb MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-21 06:49:25,946 7a7d09bb MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-21 06:49:25,946 7a7d09bb MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7a7d09bb/pkb.log>
2019-07-21 06:49:25,946 7a7d09bb MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7a7d09bb/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3426

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3426/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 23a879f5044bc42a63ceb1394d59d9927bfde79a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 23a879f5044bc42a63ceb1394d59d9927bfde79a
Commit message: "Merge pull request #9110: Revert incorrect doc on flink runner version"
 > git rev-list --no-walk 23a879f5044bc42a63ceb1394d59d9927bfde79a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4079512487417046509.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3090109473125741297.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8247455669274071777.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2688136068580723303.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2309213221326154140.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2951048389088558060.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5377030605133741447.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-21 00:49:58,031 561b9ffb MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/561b9ffb/pkb.log>
2019-07-21 00:49:58,032 561b9ffb MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-21 00:49:58,033 561b9ffb MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-21 00:49:58,273 561b9ffb MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-21 00:49:58,297 561b9ffb MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-21 00:49:58,317 561b9ffb MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-21 00:49:58,319 561b9ffb MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-21 00:49:58,320 561b9ffb MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-561b9ffb --format json --quiet --project apache-beam-testing
2019-07-21 00:49:58,859 561b9ffb MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-561b9ffb --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-561b9ffb

2019-07-21 00:49:58,860 561b9ffb MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-561b9ffb --format json --quiet --project apache-beam-testing
2019-07-21 00:49:59,392 561b9ffb MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-561b9ffb --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-561b9ffb

2019-07-21 00:49:59,394 561b9ffb MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
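The `KeyError: 'nodisk'` above comes from a plain dict lookup in `gcp_dpb_dataproc.py`: the benchmark spec's `disk_type` is used to index a provider mapping that has no `'nodisk'` entry. A minimal sketch of that failure mode follows; the mapping contents and function name are illustrative, not PerfKitBenchmarker's actual code:

```python
# Illustrative sketch of the failure, not PerfKitBenchmarker's real table.
# A provider keeps a mapping from generic disk types to Dataproc disk names;
# a spec value absent from the table raises KeyError at provision time.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Plain indexing reproduces the crash: DISK_TYPE_MAP['nodisk'] -> KeyError.
    # Catching it and re-raising gives an actionable configuration error instead.
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError('Unsupported disk_type for Dataproc: %r' % disk_type)
```

Bare `[]` indexing surfaces the raw `KeyError` seen in the log; guarding the lookup would have turned it into a clear message about the unsupported disk configuration.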
2019-07-21 00:49:59,395 561b9ffb MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-21 00:49:59,395 561b9ffb MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-21 00:49:59,395 561b9ffb MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/561b9ffb/pkb.log>
2019-07-21 00:49:59,395 561b9ffb MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/561b9ffb/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3425

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3425/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 23a879f5044bc42a63ceb1394d59d9927bfde79a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 23a879f5044bc42a63ceb1394d59d9927bfde79a
Commit message: "Merge pull request #9110: Revert incorrect doc on flink runner version"
 > git rev-list --no-walk 23a879f5044bc42a63ceb1394d59d9927bfde79a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1811883156885188348.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1272235367340771393.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4487332367692462105.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6022993908482778929.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4628269764480632249.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4942460657502189170.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7112602765956345136.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-20 18:49:55,504 338e184c MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/338e184c/pkb.log>
2019-07-20 18:49:55,505 338e184c MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-20 18:49:55,506 338e184c MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-20 18:49:55,748 338e184c MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-20 18:49:55,774 338e184c MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-20 18:49:55,797 338e184c MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-20 18:49:55,800 338e184c MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-20 18:49:55,801 338e184c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-338e184c --format json --quiet --project apache-beam-testing
2019-07-20 18:49:56,411 338e184c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-338e184c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-338e184c

2019-07-20 18:49:56,412 338e184c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-338e184c --format json --quiet --project apache-beam-testing
2019-07-20 18:49:56,953 338e184c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-338e184c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-338e184c

2019-07-20 18:49:56,956 338e184c MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-20 18:49:56,957 338e184c MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-20 18:49:56,957 338e184c MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-20 18:49:56,957 338e184c MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/338e184c/pkb.log>
2019-07-20 18:49:56,957 338e184c MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/338e184c/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3424

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3424/display/redirect?page=changes>

Changes:

[robertwb] Revert incorrect doc on flink runner version.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-8 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 23a879f5044bc42a63ceb1394d59d9927bfde79a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 23a879f5044bc42a63ceb1394d59d9927bfde79a
Commit message: "Merge pull request #9110: Revert incorrect doc on flink runner version"
 > git rev-list --no-walk 7fe54a0e178acf6957a797ac17edc4ec74e4bd42 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7103919450016702717.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2746472134853492555.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins34832169008022029.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8377673244113237935.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8911472356279416618.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4097524171150684422.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1536968777353436902.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-20 12:57:13,667 9cbeee74 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/9cbeee74/pkb.log>
2019-07-20 12:57:13,668 9cbeee74 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-20 12:57:13,669 9cbeee74 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-20 12:57:13,903 9cbeee74 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-20 12:57:13,928 9cbeee74 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-20 12:57:13,951 9cbeee74 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-20 12:57:13,954 9cbeee74 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
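For anyone triaging this failure: the traceback ends in a plain dict lookup on the spec's disk type, so `KeyError: 'nodisk'` means the benchmark spec carries a disk type that has no entry in the lookup table used when building the Dataproc cluster request. A minimal sketch of that failure mode follows; the mapping contents and function names here are illustrative assumptions, not PerfKitBenchmarker's actual table or API.

```python
# Hypothetical mapping of spec disk types to GCE disk-type names.
# The real table lives in perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py;
# these entries are assumptions for illustration only.
DISK_TYPE_MAP = {
    'standard': 'pd-standard',
    'remote_ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    """Bare lookup: raises KeyError for unmapped values such as 'nodisk',
    which is exactly the failure the build log shows."""
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default='pd-standard'):
    """Defensive variant: fall back to a default instead of raising."""
    return DISK_TYPE_MAP.get(disk_type, default)
```

In other words, the fix is either to add a `'nodisk'` entry (or skip disk configuration entirely for that case) or to guard the lookup, rather than letting the bare `[]` access abort provisioning.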
2019-07-20 12:57:13,955 9cbeee74 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-9cbeee74 --format json --quiet --project apache-beam-testing
2019-07-20 12:57:14,524 9cbeee74 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-9cbeee74 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-9cbeee74

2019-07-20 12:57:14,525 9cbeee74 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-9cbeee74 --format json --quiet --project apache-beam-testing
2019-07-20 12:57:15,106 9cbeee74 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-9cbeee74 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-9cbeee74

2019-07-20 12:57:15,109 9cbeee74 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-20 12:57:15,109 9cbeee74 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-20 12:57:15,110 9cbeee74 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-20 12:57:15,110 9cbeee74 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/9cbeee74/pkb.log>
2019-07-20 12:57:15,110 9cbeee74 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/9cbeee74/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3423

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3423/display/redirect?page=changes>

Changes:

[zyichi] Add transform_name_mapping pipeline option for python sdk

[dcavazos] Skip DoFn params test in Python 2 on Windows

[zyichi] Check streaming option when validate transform_name_mapping

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-8 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7fe54a0e178acf6957a797ac17edc4ec74e4bd42 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7fe54a0e178acf6957a797ac17edc4ec74e4bd42
Commit message: "Merge pull request #9072 from y1chi/transform_name_mapping"
 > git rev-list --no-walk 2bbf6eb1f65f49625d2f17b2703ec1e774f8c85f # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins668460443373361553.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5828300022222557668.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4047978874298097322.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3301917965626597031.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4529846410600227246.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7225169610359440315.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7727980433484177214.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-20 06:43:33,377 48db2b81 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/48db2b81/pkb.log>
2019-07-20 06:43:33,377 48db2b81 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-20 06:43:33,379 48db2b81 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-20 06:43:33,607 48db2b81 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-20 06:43:33,632 48db2b81 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-20 06:43:33,655 48db2b81 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-20 06:43:33,658 48db2b81 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-20 06:43:33,659 48db2b81 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-48db2b81 --format json --quiet --project apache-beam-testing
2019-07-20 06:43:35,091 48db2b81 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-48db2b81 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-48db2b81

2019-07-20 06:43:35,092 48db2b81 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-48db2b81 --format json --quiet --project apache-beam-testing
2019-07-20 06:43:35,695 48db2b81 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-48db2b81 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-48db2b81

2019-07-20 06:43:35,698 48db2b81 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
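The `KeyError: 'nodisk'` above comes from a plain dict lookup in the Dataproc provider: a benchmark configured without a disk ends up with disk type `'nodisk'`, which the provider's disk-type map does not contain. A minimal sketch of that failure mode (map contents and function names here are hypothetical; the real lookup lives in `perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py`):

```python
# Hypothetical, simplified stand-in for the Dataproc provider's disk-type map.
disk_to_hdfs_map = {
    'pd-standard': 'HDFS',
    'pd-ssd': 'HDFS',
}

def lookup_hdfs_type(disk_type):
    # Bare indexing: raises KeyError for an unmapped type such as 'nodisk',
    # which is exactly the failure shown in the traceback above.
    return disk_to_hdfs_map[disk_type]

def lookup_hdfs_type_safe(disk_type):
    # Defensive variant: dict.get returns None for unknown disk types
    # instead of raising, letting the caller decide how to handle it.
    return disk_to_hdfs_map.get(disk_type)
```

With this pattern, `lookup_hdfs_type('nodisk')` raises `KeyError('nodisk')` while `lookup_hdfs_type_safe('nodisk')` returns `None`.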
2019-07-20 06:43:35,699 48db2b81 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-20 06:43:35,699 48db2b81 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-20 06:43:35,699 48db2b81 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/48db2b81/pkb.log>
2019-07-20 06:43:35,700 48db2b81 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/48db2b81/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3422

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3422/display/redirect?page=changes>

Changes:

[valentyn] Split Python 3 Postcommits into several jobs

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2bbf6eb1f65f49625d2f17b2703ec1e774f8c85f (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2bbf6eb1f65f49625d2f17b2703ec1e774f8c85f
Commit message: "Merge pull request #9093: [BEAM-7714] [BEAM-7257] Split Python 3 postcommits into several Jenkins jobs."
 > git rev-list --no-walk 9589835073cd0908a47c4c0c50b8a17925a300e9 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4879025440320074590.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4721426563181004009.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6572658760024305250.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7959548846477752284.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4952008272678293666.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8909571230720592192.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2127012015474683489.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-20 00:48:56,081 3af3a05f MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/3af3a05f/pkb.log>
2019-07-20 00:48:56,082 3af3a05f MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-20 00:48:56,083 3af3a05f MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-20 00:48:56,409 3af3a05f MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-20 00:48:56,433 3af3a05f MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-20 00:48:56,456 3af3a05f MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-20 00:48:56,458 3af3a05f MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-20 00:48:56,459 3af3a05f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-3af3a05f --format json --quiet --project apache-beam-testing
2019-07-20 00:48:57,989 3af3a05f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-3af3a05f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-3af3a05f

2019-07-20 00:48:57,990 3af3a05f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-3af3a05f --format json --quiet --project apache-beam-testing
2019-07-20 00:48:58,592 3af3a05f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-3af3a05f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-3af3a05f

2019-07-20 00:48:58,595 3af3a05f MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-20 00:48:58,596 3af3a05f MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-20 00:48:58,596 3af3a05f MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-20 00:48:58,596 3af3a05f MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/3af3a05f/pkb.log>
2019-07-20 00:48:58,596 3af3a05f MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/3af3a05f/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3421

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3421/display/redirect?page=changes>

Changes:

[robertwb] Refactor portable JobService to allow better sharing of code.

[robertwb] Simplify known runner parsing code.

[robertwb] [BEAM-7722] Add a Python FlinkRunner that fetches and uses released

[robertwb] Pull out generic java job server helpers from flink.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9589835073cd0908a47c4c0c50b8a17925a300e9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9589835073cd0908a47c4c0c50b8a17925a300e9
Commit message: "Merge pull request #9043 [BEAM-7722] Python FlinkRunner fetching released artifacts."
 > git rev-list --no-walk 35c30fe611d47a21078293bb97bc6bcdf4007e1a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8805669388618744565.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8186214085038300163.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6690383023954839622.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6732629565704494285.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8049095910429134535.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2078292841522074253.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3312167475658493054.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-19 18:42:08,644 e75a4f31 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/e75a4f31/pkb.log>
2019-07-19 18:42:08,644 e75a4f31 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1280-g0beaa77
2019-07-19 18:42:08,646 e75a4f31 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-19 18:42:08,957 e75a4f31 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-19 18:42:08,982 e75a4f31 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-19 18:42:09,003 e75a4f31 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-19 18:42:09,005 e75a4f31 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
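[Editor's note] The KeyError above is a plain dict lookup failing in PerfKitBenchmarker's Dataproc provider: the benchmark config resolves to the disk type 'nodisk', which has no entry in the provider's disk-type map, so indexing raises. A minimal sketch of that failure mode, with hypothetical names (the real map lives in gcp_dpb_dataproc.py; the entries below are illustrative, not the actual map contents):

```python
# Illustrative stand-in for the provider's disk-type map; the real mapping
# in gcp_dpb_dataproc.py and its entries may differ.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',  # assumed entry for illustration
    'pd-ssd': 'pd-ssd',            # assumed entry for illustration
}

def resolve_disk_type(disk_type):
    """Guarded variant of the failing lookup.

    Bare indexing, DISK_TYPE_MAP[disk_type], reproduces the crash seen in
    the log when disk_type == 'nodisk'. Using .get() returns None for an
    unmapped type instead of raising KeyError.
    """
    return DISK_TYPE_MAP.get(disk_type)
```

With the guarded lookup, `resolve_disk_type('nodisk')` yields None rather than aborting provisioning; the actual fix in PKB would decide how a missing mapping should be handled, this only shows why the bare index raises.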
2019-07-19 18:42:09,006 e75a4f31 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-e75a4f31 --format json --quiet --project apache-beam-testing
2019-07-19 18:42:09,704 e75a4f31 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-e75a4f31 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-e75a4f31

2019-07-19 18:42:09,704 e75a4f31 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-e75a4f31 --format json --quiet --project apache-beam-testing
2019-07-19 18:42:10,263 e75a4f31 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-e75a4f31 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-e75a4f31

2019-07-19 18:42:10,266 e75a4f31 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-19 18:42:10,267 e75a4f31 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-19 18:42:10,267 e75a4f31 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-19 18:42:10,267 e75a4f31 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/e75a4f31/pkb.log>
2019-07-19 18:42:10,267 e75a4f31 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/e75a4f31/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3420

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3420/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-4 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 35c30fe611d47a21078293bb97bc6bcdf4007e1a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 35c30fe611d47a21078293bb97bc6bcdf4007e1a
Commit message: "Merge pull request #9105: [BEAM-7784] Fixup for Guava upgrade"
 > git rev-list --no-walk 35c30fe611d47a21078293bb97bc6bcdf4007e1a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7310743012408949077.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5181973254182668423.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4155385263426531631.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4759790118011633416.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9101237035403969508.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins383353118672195537.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3453015160385359541.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-19 12:43:10,933 c5b6cb4e MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c5b6cb4e/pkb.log>
2019-07-19 12:43:10,934 c5b6cb4e MainThread INFO     PerfKitBenchmarker version: v1.12.0-1279-g34c438a
2019-07-19 12:43:10,935 c5b6cb4e MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-19 12:43:11,146 c5b6cb4e MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-19 12:43:11,172 c5b6cb4e MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-19 12:43:11,195 c5b6cb4e MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-19 12:43:11,197 c5b6cb4e MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-19 12:43:11,198 c5b6cb4e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-c5b6cb4e --format json --quiet --project apache-beam-testing
2019-07-19 12:43:12,485 c5b6cb4e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-c5b6cb4e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-c5b6cb4e

2019-07-19 12:43:12,486 c5b6cb4e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-c5b6cb4e --format json --quiet --project apache-beam-testing
2019-07-19 12:43:13,098 c5b6cb4e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-c5b6cb4e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-c5b6cb4e

2019-07-19 12:43:13,101 c5b6cb4e MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
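
The KeyError above comes from indexing a disk-type translation table with the value 'nodisk', which the Dataproc provider's map does not contain. A minimal sketch of the lookup pattern and a guarded variant that fails with a clearer message (the map contents and function name below are illustrative assumptions; the real mapping lives in PerfKitBenchmarker's gcp_dpb_dataproc.py):

```python
# Illustrative stand-in for the provider's disk-type table.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def translate_disk_type(disk_type):
    # A bare index (DISK_TYPE_MAP[disk_type]) raises KeyError: 'nodisk'
    # when the benchmark config specifies no disk; a guarded lookup
    # surfaces the supported values instead.
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError('unsupported disk type %r; expected one of %s'
                         % (disk_type, sorted(DISK_TYPE_MAP)))
```

With the guard, an unsupported configuration fails with an actionable message rather than a raw KeyError deep inside the provisioning phase.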
2019-07-19 12:43:13,101 c5b6cb4e MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-19 12:43:13,102 c5b6cb4e MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-19 12:43:13,102 c5b6cb4e MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c5b6cb4e/pkb.log>
2019-07-19 12:43:13,102 c5b6cb4e MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c5b6cb4e/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3419

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3419/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-7784] Fixup for Guava upgrade

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 35c30fe611d47a21078293bb97bc6bcdf4007e1a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 35c30fe611d47a21078293bb97bc6bcdf4007e1a
Commit message: "Merge pull request #9105: [BEAM-7784] Fixup for Guava upgrade"
 > git rev-list --no-walk 8c8a902cfd3bb5502c757aa32d84ebc30c768fac # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins370930641639630201.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins148437917040091858.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3663593788147505713.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5691864576795033633.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins451669996919552690.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2349344779304471705.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2738599983963560168.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-19 06:24:10,986 1e50bdeb MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1e50bdeb/pkb.log>
2019-07-19 06:24:10,987 1e50bdeb MainThread INFO     PerfKitBenchmarker version: v1.12.0-1279-g34c438a
2019-07-19 06:24:10,988 1e50bdeb MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-19 06:24:11,235 1e50bdeb MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-19 06:24:11,259 1e50bdeb MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-19 06:24:11,281 1e50bdeb MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-19 06:24:11,283 1e50bdeb MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-19 06:24:11,284 1e50bdeb MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-1e50bdeb --format json --quiet --project apache-beam-testing
2019-07-19 06:24:11,828 1e50bdeb MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-1e50bdeb --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-1e50bdeb

2019-07-19 06:24:11,829 1e50bdeb MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-1e50bdeb --format json --quiet --project apache-beam-testing
2019-07-19 06:24:12,410 1e50bdeb MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-1e50bdeb --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-1e50bdeb

2019-07-19 06:24:12,412 1e50bdeb MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-19 06:24:12,413 1e50bdeb MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-19 06:24:12,413 1e50bdeb MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-19 06:24:12,414 1e50bdeb MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1e50bdeb/pkb.log>
2019-07-19 06:24:12,414 1e50bdeb MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1e50bdeb/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3418

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3418/display/redirect?page=changes>

Changes:

[pabloem] Matching on filename, not directory for fileio

[pabloem] Lint fixup

[pabloem] Lint fixup

[github] Add retractions doc to design doc page.

[github] Add last_attempted_record_start to python OffsetRangeTracker (#9058)

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8c8a902cfd3bb5502c757aa32d84ebc30c768fac (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8c8a902cfd3bb5502c757aa32d84ebc30c768fac
Commit message: "Add last_attempted_record_start to python OffsetRangeTracker (#9058)"
 > git rev-list --no-walk 9e8a564db93b38f186abdc5a4cfb801b2e4c0dfa # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6131086360855841223.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8098077397064007751.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4593854222966767490.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3965761045224104613.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4260454627774439608.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8267221275454954178.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6393572742392816234.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-19 00:35:42,669 845c38a6 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/845c38a6/pkb.log>
2019-07-19 00:35:42,670 845c38a6 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1279-g34c438a
2019-07-19 00:35:42,671 845c38a6 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-19 00:35:43,178 845c38a6 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-19 00:35:43,202 845c38a6 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-19 00:35:43,226 845c38a6 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-19 00:35:43,229 845c38a6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
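The KeyError above comes from `gcp_dpb_dataproc.py` indexing a disk-type lookup table with a value ('nodisk') that the table does not contain. A minimal sketch of that failure mode follows; the mapping contents and the function name `resolve_disk_type` are hypothetical illustrations, not PerfKitBenchmarker's actual code:

```python
# Hypothetical disk-type mapping; the real table lives in
# PerfKitBenchmarker's GCP Dataproc provider and its exact keys may differ.
DISK_TYPE_TO_API_NAME = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # A plain dict index raises KeyError for any unlisted disk type,
    # which is exactly what the traceback shows for 'nodisk'.
    return DISK_TYPE_TO_API_NAME[disk_type]

try:
    resolve_disk_type('nodisk')
except KeyError as e:
    print('KeyError:', e)
```

A lookup like this fails hard on unexpected input; guarding it with `dict.get` plus an explicit error message (or adding the missing key) would be the usual fix.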
2019-07-19 00:35:43,230 845c38a6 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-845c38a6 --format json --quiet --project apache-beam-testing
2019-07-19 00:35:44,827 845c38a6 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-845c38a6 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-845c38a6

2019-07-19 00:35:44,828 845c38a6 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-845c38a6 --format json --quiet --project apache-beam-testing
2019-07-19 00:35:45,432 845c38a6 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-845c38a6 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-845c38a6

2019-07-19 00:35:45,434 845c38a6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-19 00:35:45,435 845c38a6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-19 00:35:45,435 845c38a6 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-19 00:35:45,435 845c38a6 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/845c38a6/pkb.log>
2019-07-19 00:35:45,435 845c38a6 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/845c38a6/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3417

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3417/display/redirect?page=changes>

Changes:

[yoshiki.obata] [BEAM-7284] enabled to pickle MappingProxyType in order to pickle

[lostluck] Fix documentation on iterable coder spec.

[lostluck] Add the ability to set the service account email for dataflow jobs

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-4 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 9e8a564db93b38f186abdc5a4cfb801b2e4c0dfa (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 9e8a564db93b38f186abdc5a4cfb801b2e4c0dfa
Commit message: "Add the ability to set the service account email for dataflow jobs"
 > git rev-list --no-walk 8656d4cec72262bca3f1c469ab422c50f1cd124a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3246652638336223007.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1629493098397460820.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7202586551751914864.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1688909909068923764.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins934371949182759805.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8571342692427181058.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins456137703067079647.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-18 18:14:30,701 3440e411 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/3440e411/pkb.log>
2019-07-18 18:14:30,702 3440e411 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-18 18:14:30,703 3440e411 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-18 18:14:30,996 3440e411 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-18 18:14:31,029 3440e411 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-18 18:14:31,053 3440e411 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-18 18:14:31,056 3440e411 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-18 18:14:31,057 3440e411 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-3440e411 --format json --quiet --project apache-beam-testing
2019-07-18 18:14:31,781 3440e411 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-3440e411 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-3440e411

2019-07-18 18:14:31,782 3440e411 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-3440e411 --format json --quiet --project apache-beam-testing
2019-07-18 18:14:32,393 3440e411 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-3440e411 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-3440e411

2019-07-18 18:14:32,396 3440e411 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
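The KeyError itself comes from line 140 of gcp_dpb_dataproc.py indexing a mapping with the configured `disk_spec.disk_type`, for which there is no 'nodisk' entry. A minimal reproduction of the failure mode, using hypothetical names (the real table's contents are not shown in this log), together with the usual defensive lookup:

```python
# Hypothetical disk-type translation table; the real mapping in
# gcp_dpb_dataproc.py evidently lacks an entry for 'nodisk'.
DISK_TYPE_MAP = {"pd-standard": "pd-standard", "pd-ssd": "pd-ssd"}

disk_type = "nodisk"  # value arriving from the benchmark's disk_spec

try:
    gcp_type = DISK_TYPE_MAP[disk_type]  # raises KeyError: 'nodisk'
except KeyError:
    # Defensive alternative: treat unknown disk types as "no disk",
    # e.g. gcp_type = DISK_TYPE_MAP.get(disk_type)
    gcp_type = None

print(gcp_type)  # None
```

Because the raw `[...]` lookup is unguarded in the provider code, the provision phase aborts before any cluster exists, which in turn produces the NOT_FOUND teardown messages seen elsewhere in this log.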
2019-07-18 18:14:32,396 3440e411 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-18 18:14:32,397 3440e411 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-18 18:14:32,397 3440e411 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/3440e411/pkb.log>
2019-07-18 18:14:32,397 3440e411 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/3440e411/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3416

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3416/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8656d4cec72262bca3f1c469ab422c50f1cd124a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8656d4cec72262bca3f1c469ab422c50f1cd124a
Commit message: "Merge pull request #9040 from riazela/JoinReordering"
 > git rev-list --no-walk 8656d4cec72262bca3f1c469ab422c50f1cd124a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5859571289102473092.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2029078958594912674.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1877321255819927822.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1122567420387448537.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5675453927550492190.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4788983576906171489.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3168627775572679846.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-18 12:40:47,395 854d3099 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/854d3099/pkb.log>
2019-07-18 12:40:47,395 854d3099 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-18 12:40:47,396 854d3099 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-18 12:40:47,626 854d3099 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-18 12:40:47,650 854d3099 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-18 12:40:47,671 854d3099 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-18 12:40:47,673 854d3099 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-18 12:40:47,674 854d3099 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-854d3099 --format json --quiet --project apache-beam-testing
2019-07-18 12:40:48,980 854d3099 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-854d3099 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-854d3099

2019-07-18 12:40:48,981 854d3099 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-854d3099 --format json --quiet --project apache-beam-testing
2019-07-18 12:40:49,474 854d3099 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-854d3099 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-854d3099

2019-07-18 12:40:49,477 854d3099 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-18 12:40:49,477 854d3099 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-18 12:40:49,477 854d3099 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-18 12:40:49,478 854d3099 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/854d3099/pkb.log>
2019-07-18 12:40:49,478 854d3099 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/854d3099/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3415

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3415/display/redirect?page=changes>

Changes:

[alireza4263] [BEAM-7545] Reordering Beam Joins and check if the produced join is

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8656d4cec72262bca3f1c469ab422c50f1cd124a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8656d4cec72262bca3f1c469ab422c50f1cd124a
Commit message: "Merge pull request #9040 from riazela/JoinReordering"
 > git rev-list --no-walk 76bc70d49aa88789bcd230ad752fcbb6ec83d4a0 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2973728960538134158.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2256742185401919561.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4635894121761795561.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6112056148518148113.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1986682800887971394.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7905878993345312452.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5141571035615973828.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-18 06:33:03,086 490499bc MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/490499bc/pkb.log>
2019-07-18 06:33:03,087 490499bc MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-18 06:33:03,090 490499bc MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
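The --config_override flag in the list above patches a single key inside the benchmark's nested configuration using a dotted path. A minimal sketch of how such an override can be applied to a nested dict (hypothetical helper; PKB's actual parser is more involved):

```python
def apply_override(config, override):
    # Apply a dotted-path override such as
    # 'dpb_wordcount_benchmark.dpb_service.service_type=dataproc'
    # to a nested dict, creating intermediate levels as needed.
    path, _, value = override.partition('=')
    keys = path.split('.')
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config
```

This is why the benchmark runs against Dataproc even though the job name says Spark: the override rewrites dpb_service.service_type before provisioning starts.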
2019-07-18 06:33:03,481 490499bc MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-18 06:33:03,507 490499bc MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-18 06:33:03,531 490499bc MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-18 06:33:03,533 490499bc MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
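The traceback above ends in an unguarded dict lookup: gcp_dpb_dataproc.py indexes a disk-type map with the configured disk_spec.disk_type, and 'nodisk' is not a key in that map. A minimal sketch of the failure mode and a defensive alternative (the map and function names below are hypothetical, not PKB's actual code):

```python
# Hypothetical disk-type map; 'nodisk' is absent, mirroring the failure above.
PD_TYPE_MAP = {'pd-standard': 'pd-standard', 'pd-ssd': 'pd-ssd'}

def resolve_disk_type(disk_type):
    # Unguarded lookup: raises KeyError('nodisk') for unmapped disk types.
    return PD_TYPE_MAP[disk_type]

def resolve_disk_type_or_none(disk_type):
    # Defensive variant: returns None for unmapped types so the caller can
    # skip attaching a data disk instead of aborting the provision phase.
    return PD_TYPE_MAP.get(disk_type)
```

Because the lookup happens during provisioning, the benchmark fails before any cluster exists, which explains the NOT_FOUND errors during the cleanup that follows.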
2019-07-18 06:33:03,535 490499bc MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-490499bc --format json --quiet --project apache-beam-testing
2019-07-18 06:33:04,972 490499bc MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-490499bc --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-490499bc

2019-07-18 06:33:04,973 490499bc MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-490499bc --format json --quiet --project apache-beam-testing
2019-07-18 06:33:05,620 490499bc MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-490499bc --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-490499bc

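Both teardown commands fail with NOT_FOUND because provisioning aborted before the cluster was created; PKB runs cleanup unconditionally, so these errors are benign. A sketch of classifying teardown results idempotently (hypothetical helper, not PKB's actual code):

```python
def cleanup_succeeded(returncode, stderr):
    # Idempotent teardown: deleting a resource that never existed (or is
    # already gone) counts as success. Any other non-zero exit is a real
    # cleanup failure that should be surfaced.
    return returncode == 0 or 'NOT_FOUND' in stderr
```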
2019-07-18 06:33:05,623 490499bc MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-18 06:33:05,623 490499bc MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-18 06:33:05,623 490499bc MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-18 06:33:05,624 490499bc MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/490499bc/pkb.log>
2019-07-18 06:33:05,624 490499bc MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/490499bc/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3414

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3414/display/redirect?page=changes>

Changes:

[juta.staes] [BEAM-7630] add ITs for writing and reading bytes from pubsub

[pabloem] Adding logging and adding one more retry

[hannahjiang] BEAM-3645 improve test cases

[pabloem] [BEAM-7530] Add it test to read None values from BigQuery (#8875)

[github] [BEAM-7499] Fixup for tricky Reify testing issue (#9077)

[github] [BEAM-7641] Collect xunit statistics for Py ITs (#8952)

[github] [BEAM-4948, BEAM-6267, BEAM-5559, BEAM-7289] Update the version of guava

[udim] [BEAM-7484] Metrics collection in BigQuery perf tests (#8766)

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 76bc70d49aa88789bcd230ad752fcbb6ec83d4a0 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 76bc70d49aa88789bcd230ad752fcbb6ec83d4a0
Commit message: "[BEAM-7484] Metrics collection in BigQuery perf tests (#8766)"
 > git rev-list --no-walk adb281f7e7b3dc9076223e49bc83ca15c9e3f8c7 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1552256601817902989.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8749072597404250553.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8641834938785415031.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6573430097579858728.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2383737860357487077.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1013569439994843469.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5852962021960926265.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-18 00:36:01,433 c0bafe3f MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c0bafe3f/pkb.log>
2019-07-18 00:36:01,434 c0bafe3f MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-18 00:36:01,435 c0bafe3f MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-18 00:36:01,662 c0bafe3f MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-18 00:36:01,687 c0bafe3f MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-18 00:36:01,710 c0bafe3f MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-18 00:36:01,712 c0bafe3f MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-18 00:36:01,714 c0bafe3f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-c0bafe3f --format json --quiet --project apache-beam-testing
2019-07-18 00:36:02,364 c0bafe3f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-c0bafe3f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-c0bafe3f

2019-07-18 00:36:02,365 c0bafe3f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-c0bafe3f --format json --quiet --project apache-beam-testing
2019-07-18 00:36:02,920 c0bafe3f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-c0bafe3f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-c0bafe3f

2019-07-18 00:36:02,923 c0bafe3f MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-18 00:36:02,923 c0bafe3f MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-18 00:36:02,924 c0bafe3f MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-18 00:36:02,924 c0bafe3f MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c0bafe3f/pkb.log>
2019-07-18 00:36:02,924 c0bafe3f MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c0bafe3f/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3413

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3413/display/redirect?page=changes>

Changes:

[lukasz.gajowy] [BEAM-4420] Extract method for publishing already collected metrics

[lukasz.gajowy] [BEAM-4420] Collect & write metrics to BigQuery and console

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision adb281f7e7b3dc9076223e49bc83ca15c9e3f8c7 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f adb281f7e7b3dc9076223e49bc83ca15c9e3f8c7
Commit message: "Merge pull request #9073: [BEAM-4420] KafkaIOIT metrics collection"
 > git rev-list --no-walk ccfd31e025be37de5dfbc46d036f2efe648e3c92 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins107896628099320124.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6719015815266168282.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7923870800939700327.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6290379808838607065.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1081884296705905172.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5810364501226164578.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5207287284644521169.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-17 18:36:35,333 1cffcb9f MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1cffcb9f/pkb.log>
2019-07-17 18:36:35,334 1cffcb9f MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-17 18:36:35,336 1cffcb9f MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
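The `--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc` flag above uses a dotted key path to patch one leaf of a nested benchmark config. A hypothetical sketch of how such an override can be applied to a nested dict (this is not PKB's real parser, just the dotted-path idea):

```python
def apply_override(config, override):
    """Apply a single 'a.b.c=value' override to a nested dict in place."""
    path, _, value = override.partition("=")  # split on the first '='
    keys = path.split(".")
    node = config
    for key in keys[:-1]:
        # Walk (and create, if missing) the intermediate dicts.
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config

cfg = apply_override(
    {}, "dpb_wordcount_benchmark.dpb_service.service_type=dataproc")
```

One consequence of this scheme, visible in the WARNING a few lines below, is that a mistyped path segment silently creates a new nested key rather than failing, which is why PKB warns when an override key is not in the default config.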
2019-07-17 18:36:35,620 1cffcb9f MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-17 18:36:35,647 1cffcb9f MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-17 18:36:35,672 1cffcb9f MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-17 18:36:35,675 1cffcb9f MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-17 18:36:35,676 1cffcb9f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-1cffcb9f --format json --quiet --project apache-beam-testing
2019-07-17 18:36:37,207 1cffcb9f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-1cffcb9f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-1cffcb9f

2019-07-17 18:36:37,209 1cffcb9f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-1cffcb9f --format json --quiet --project apache-beam-testing
2019-07-17 18:36:37,825 1cffcb9f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-1cffcb9f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-1cffcb9f

2019-07-17 18:36:37,827 1cffcb9f MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-17 18:36:37,828 1cffcb9f MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-17 18:36:37,828 1cffcb9f MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-17 18:36:37,829 1cffcb9f MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1cffcb9f/pkb.log>
2019-07-17 18:36:37,829 1cffcb9f MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1cffcb9f/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3412

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3412/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7557] - Migrate DynamoDBIO to AWS SDK for Java 2

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-8 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision ccfd31e025be37de5dfbc46d036f2efe648e3c92 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ccfd31e025be37de5dfbc46d036f2efe648e3c92
Commit message: "Merge pull request #9086: [BEAM-7557] Migrate DynamoDBIO to AWS SDK for Java 2"
 > git rev-list --no-walk a0ec59b8df53cb5bc0ab2196ce09ac954cf8d481 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8961537323691496833.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2427075023301116045.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1886078656093651801.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7537630482898404853.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2111058028599471482.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5146306783077342025.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2349725170553938392.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-17 12:35:55,555 d5355e5f MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d5355e5f/pkb.log>
2019-07-17 12:35:55,556 d5355e5f MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-17 12:35:55,557 d5355e5f MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-17 12:35:55,779 d5355e5f MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-17 12:35:55,804 d5355e5f MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-17 12:35:55,825 d5355e5f MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-17 12:35:55,828 d5355e5f MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
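The KeyError above is a plain dict subscript failing: the Dataproc provider indexes a disk-type table with the configured disk_spec.disk_type, and 'nodisk' has no entry in that table. A minimal sketch of the failure mode follows; the table contents and function names here are hypothetical illustrations, not PerfKitBenchmarker's actual code.

```python
# Hypothetical disk-type table, illustrating the failure mode only;
# the real mapping lives in gcp_dpb_dataproc.py.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # A bare subscript raises KeyError for unknown types such as 'nodisk',
    # which is what aborts the provision phase in the log above.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default='pd-standard'):
    # A guarded lookup avoids the crash; 'nodisk' could instead be
    # special-cased to skip disk provisioning entirely.
    return DISK_TYPE_MAP.get(disk_type, default)
```

Using `dict.get` with a default (or special-casing 'nodisk' before the lookup) would turn this hard crash into a recoverable configuration choice.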
2019-07-17 12:35:55,829 d5355e5f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-d5355e5f --format json --quiet --project apache-beam-testing
2019-07-17 12:35:56,924 d5355e5f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-d5355e5f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-d5355e5f

2019-07-17 12:35:56,925 d5355e5f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-d5355e5f --format json --quiet --project apache-beam-testing
2019-07-17 12:35:57,469 d5355e5f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-d5355e5f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-d5355e5f

2019-07-17 12:35:57,471 d5355e5f MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-17 12:35:57,472 d5355e5f MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-17 12:35:57,472 d5355e5f MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-17 12:35:57,472 d5355e5f MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d5355e5f/pkb.log>
2019-07-17 12:35:57,472 d5355e5f MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d5355e5f/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3411

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3411/display/redirect?page=changes>

Changes:

[dcavazos] Add Python snippet for Regex transform

[kmj] Fix stream position bug in BigQuery Storage stream source.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a0ec59b8df53cb5bc0ab2196ce09ac954cf8d481 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a0ec59b8df53cb5bc0ab2196ce09ac954cf8d481
Commit message: "Merge pull request #9059: [BEAM-7743] Fix stream position bug in BigQuery Storage stream source."
 > git rev-list --no-walk 4a7ba2d2d4a13fefb7b9ec9bd73627584e2663be # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1424944920663992159.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9002204545253976063.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8511707308954426861.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5477491926974849142.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6819350525890887483.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1257038300948865852.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3066711631821821629.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-17 06:24:26,705 45821944 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/45821944/pkb.log>
2019-07-17 06:24:26,706 45821944 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-17 06:24:26,707 45821944 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-17 06:24:26,990 45821944 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-17 06:24:27,013 45821944 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-17 06:24:27,035 45821944 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-17 06:24:27,037 45821944 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-17 06:24:27,038 45821944 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-45821944 --format json --quiet --project apache-beam-testing
2019-07-17 06:24:28,309 45821944 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-45821944 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-45821944

2019-07-17 06:24:28,309 45821944 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-45821944 --format json --quiet --project apache-beam-testing
2019-07-17 06:24:28,870 45821944 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-45821944 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-45821944

2019-07-17 06:24:28,873 45821944 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-17 06:24:28,873 45821944 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-17 06:24:28,874 45821944 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-17 06:24:28,874 45821944 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/45821944/pkb.log>
2019-07-17 06:24:28,874 45821944 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/45821944/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3410

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3410/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-7632] Update Python quickstart guide for Flink and Spark

[htyleo] [BEAM-7665] Support TypeDefinition options in beam.Combine()

[htyleo] Revise the comments to only mention the TypeDefinition options.

[github] Label the ParDos with their name

[ttanay100] [BEAM-7674] Combine batch and streaming BQ Streaming Insert ITs

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-3 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4a7ba2d2d4a13fefb7b9ec9bd73627584e2663be (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4a7ba2d2d4a13fefb7b9ec9bd73627584e2663be
Commit message: "Merge pull request #8949 from ibzib/quickstart"
 > git rev-list --no-walk 88c3f6b380f287ec0f9a1f803d55cd3880fe7727 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1941877554379519036.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1934778581442108772.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5212738092344701618.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2897654593984309375.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8783328120871633337.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7751087659887121026.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins933190023789625605.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-17 00:39:07,292 7581025c MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7581025c/pkb.log>
2019-07-17 00:39:07,293 7581025c MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-17 00:39:07,294 7581025c MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-17 00:39:07,572 7581025c MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-17 00:39:07,596 7581025c MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-17 00:39:07,619 7581025c MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-17 00:39:07,621 7581025c MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-17 00:39:07,623 7581025c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-7581025c --format json --quiet --project apache-beam-testing
2019-07-17 00:39:08,842 7581025c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-7581025c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-7581025c

2019-07-17 00:39:08,843 7581025c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-7581025c --format json --quiet --project apache-beam-testing
2019-07-17 00:39:09,419 7581025c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-7581025c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-7581025c

2019-07-17 00:39:09,422 7581025c MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-17 00:39:09,422 7581025c MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-17 00:39:09,422 7581025c MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-17 00:39:09,423 7581025c MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7581025c/pkb.log>
2019-07-17 00:39:09,423 7581025c MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7581025c/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3409

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3409/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7715] Mark user facing APIs related to External transforms as

[udim] [BEAM-7578] add py37 hdfs integration test (#8970)

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 88c3f6b380f287ec0f9a1f803d55cd3880fe7727 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 88c3f6b380f287ec0f9a1f803d55cd3880fe7727
Commit message: "[BEAM-7578] add py37 hdfs integration test (#8970)"
 > git rev-list --no-walk 41478d00d34598e56471d99d0845ac16efa5b8ef # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5748459971688486881.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8911648399430465846.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3356163746394614295.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7055443871497847618.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4174544033178650918.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6486959932233455676.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5196344414582732067.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-16 18:32:00,327 97c7a992 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/97c7a992/pkb.log>
2019-07-16 18:32:00,328 97c7a992 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-16 18:32:00,329 97c7a992 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-16 18:32:00,818 97c7a992 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-16 18:32:00,843 97c7a992 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-16 18:32:00,864 97c7a992 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-16 18:32:00,867 97c7a992 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
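The `KeyError: 'nodisk'` above is a plain dictionary-lookup failure: the Dataproc provider indexes a disk-type mapping with `self.spec.worker_group.disk_spec.disk_type`, and the configured value `'nodisk'` has no entry in that mapping. A minimal sketch of the pattern (the mapping name and its entries here are hypothetical illustrations, not PKB's actual code):

```python
# Hypothetical sketch of the failure mode: a provider keeps a mapping from
# PKB disk-type names to cloud API disk types, and an unmapped value such
# as 'nodisk' raises a bare KeyError during cluster provisioning.

DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',  # illustrative entries only
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    """Look up the API disk type, failing with context if it is unmapped."""
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        # Re-raise with the supported keys so the log explains itself,
        # instead of the opaque "KeyError: 'nodisk'" seen above.
        raise KeyError(
            'Unsupported disk_type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))
```

Guarding the lookup this way would turn the one-line `KeyError: 'nodisk'` into a message naming the supported disk types, making failures like this one diagnosable from the Jenkins log alone.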
2019-07-16 18:32:00,868 97c7a992 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-97c7a992 --format json --quiet --project apache-beam-testing
2019-07-16 18:32:02,169 97c7a992 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-97c7a992 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-97c7a992

2019-07-16 18:32:02,170 97c7a992 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-97c7a992 --format json --quiet --project apache-beam-testing
2019-07-16 18:32:02,751 97c7a992 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-97c7a992 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-97c7a992

2019-07-16 18:32:02,754 97c7a992 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-16 18:32:02,755 97c7a992 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-16 18:32:02,755 97c7a992 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-16 18:32:02,755 97c7a992 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/97c7a992/pkb.log>
2019-07-16 18:32:02,755 97c7a992 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/97c7a992/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3408

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3408/display/redirect?page=changes>

Changes:

[lgajowy] [BEAM-6675] Generate JDBC statement and preparedStatementSetter

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-4 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 41478d00d34598e56471d99d0845ac16efa5b8ef (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 41478d00d34598e56471d99d0845ac16efa5b8ef
Commit message: "[BEAM-6675] Generate JDBC statement and preparedStatementSetter automatically when schema is available (#8962)"
 > git rev-list --no-walk 708cea559848398dbe8a0b1071a3e391bf74acee # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4519326050161789018.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins616745393176966616.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5014115166902055192.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5190467396290213975.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3506543016407016804.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3016171308214442729.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3748789290080073825.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-16 12:22:55,379 64f0f4dd MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/64f0f4dd/pkb.log>
2019-07-16 12:22:55,380 64f0f4dd MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-16 12:22:55,381 64f0f4dd MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-16 12:22:55,776 64f0f4dd MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-16 12:22:55,800 64f0f4dd MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-16 12:22:55,822 64f0f4dd MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-16 12:22:55,824 64f0f4dd MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-16 12:22:55,825 64f0f4dd MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-64f0f4dd --format json --quiet --project apache-beam-testing
2019-07-16 12:22:57,195 64f0f4dd MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-64f0f4dd --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-64f0f4dd

2019-07-16 12:22:57,196 64f0f4dd MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-64f0f4dd --format json --quiet --project apache-beam-testing
2019-07-16 12:22:57,781 64f0f4dd MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-64f0f4dd --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-64f0f4dd

2019-07-16 12:22:57,784 64f0f4dd MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-16 12:22:57,784 64f0f4dd MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-16 12:22:57,784 64f0f4dd MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-16 12:22:57,785 64f0f4dd MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/64f0f4dd/pkb.log>
2019-07-16 12:22:57,785 64f0f4dd MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/64f0f4dd/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3407

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3407/display/redirect?page=changes>

Changes:

[rezarokni] Edit of Looping Timer Blog to fix issue with Timer State

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 708cea559848398dbe8a0b1071a3e391bf74acee (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 708cea559848398dbe8a0b1071a3e391bf74acee
Commit message: "Merge pull request #9010: Edit of Looping Timer Blog to fix issue with Timer State"
 > git rev-list --no-walk 8001ad2926a6df2a35f765c71aa62e51c3d249e1 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3081669437052597530.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7523927694517609812.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3237444940729560029.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1190507771803984627.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins867253330736873666.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8306298424772948990.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins931999128538894604.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-16 06:18:05,829 4978687e MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/4978687e/pkb.log>
2019-07-16 06:18:05,829 4978687e MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-16 06:18:05,831 4978687e MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-16 06:18:06,080 4978687e MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-16 06:18:06,104 4978687e MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-16 06:18:06,125 4978687e MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-16 06:18:06,127 4978687e MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-16 06:18:06,128 4978687e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-4978687e --format json --quiet --project apache-beam-testing
2019-07-16 06:18:07,465 4978687e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-4978687e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-4978687e

2019-07-16 06:18:07,466 4978687e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-4978687e --format json --quiet --project apache-beam-testing
2019-07-16 06:18:08,024 4978687e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-4978687e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-4978687e

2019-07-16 06:18:08,027 4978687e MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-16 06:18:08,028 4978687e MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-16 06:18:08,028 4978687e MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-16 06:18:08,028 4978687e MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/4978687e/pkb.log>
2019-07-16 06:18:08,028 4978687e MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/4978687e/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3406

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3406/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-7656] Add sdk-worker-parallelism arg to flink job server shadow

[dcrhodes] [BEAM-7666] Memory monitor change

[dcrhodes] [BEAM-7666] Adds the counter

[boyuanz] Add estimate_size() to source_test.LineSource

[boyuanz] fix lint

[boyuanz] Add _get_file_size and UT

[ankurgoenka] [BEAM-7546] Increasing environment cache to avoid chances of recreating

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 8001ad2926a6df2a35f765c71aa62e51c3d249e1 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 8001ad2926a6df2a35f765c71aa62e51c3d249e1
Commit message: "Merge pull request #8971 from dustin12/ThrottlePiping"
 > git rev-list --no-walk 5fb21e38d9d0e73db514e13a93c15578302c11fa # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6863064125231228128.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5829209849105785434.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8668062868823303885.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2200524128196554085.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5710504538352014026.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5757749805572749638.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5730131866482567116.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-16 00:17:42,040 8bf0295e MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/8bf0295e/pkb.log>
2019-07-16 00:17:42,041 8bf0295e MainThread INFO     PerfKitBenchmarker version: v1.12.0-1270-g54d0d46
2019-07-16 00:17:42,042 8bf0295e MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-16 00:17:42,361 8bf0295e MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-16 00:17:42,387 8bf0295e MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-16 00:17:42,411 8bf0295e MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-16 00:17:42,413 8bf0295e MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
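The KeyError above comes from indexing a disk-type mapping with the worker group's resolved disk type, which here is 'nodisk' (no data disk configured) and is absent from the table in gcp_dpb_dataproc.py. A minimal sketch of the failing pattern and a defensive variant — the mapping contents and the 'pd-standard' default below are illustrative assumptions, not PKB's actual code:

```python
# Illustrative stand-in for the provider's disk-type table; the real
# one lives in gcp_dpb_dataproc.py and does not contain a 'nodisk' key.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    """Reproduces the failing pattern: a bare dict index raises KeyError
    for any disk type missing from the table."""
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default='pd-standard'):
    """Defensive variant: fall back to a default instead of crashing
    during the provision phase."""
    return DISK_TYPE_MAP.get(disk_type, default)
```

With this shape, resolve_disk_type('nodisk') raises the same KeyError seen in the traceback, while the .get() variant lets provisioning proceed with a fallback.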
2019-07-16 00:17:42,414 8bf0295e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-8bf0295e --format json --quiet --project apache-beam-testing
2019-07-16 00:17:43,033 8bf0295e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-8bf0295e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-8bf0295e

2019-07-16 00:17:43,034 8bf0295e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-8bf0295e --format json --quiet --project apache-beam-testing
2019-07-16 00:17:43,627 8bf0295e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-8bf0295e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-8bf0295e
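The delete and describe calls above return NOT_FOUND because the cluster was never created before provisioning failed. For teardown, a nonzero exit with NOT_FOUND is effectively success; a small sketch of that decision, using the stderr text captured in this log (the helper name is an assumption, not PKB code):

```python
def cleanup_ok(returncode, stderr):
    """Decide whether a teardown command left things clean: exit 0, or a
    NOT_FOUND error meaning the resource never existed in the first place."""
    return returncode == 0 or 'NOT_FOUND' in stderr

# The gcloud failure recorded above should count as a clean teardown:
stderr = ('ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: '
          'Cluster projects/apache-beam-testing/regions/global/clusters/pkb-8bf0295e')
assert cleanup_ok(1, stderr)
```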

2019-07-16 00:17:43,630 8bf0295e MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-16 00:17:43,630 8bf0295e MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-16 00:17:43,631 8bf0295e MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-16 00:17:43,631 8bf0295e MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/8bf0295e/pkb.log>
2019-07-16 00:17:43,631 8bf0295e MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/8bf0295e/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3405

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3405/display/redirect?page=changes>

Changes:

[yanzhi.wyl] [BEAM-7694]Fix error spelling in annotation of SparkTransformOverrides.

[yanzhi.wyl] update annotation for PTransformMatcher.

[sniemitz] Done notifications for BigtableIO.Write

[ankurgoenka] [BEAM-7736] Free worker when work is skipped

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5fb21e38d9d0e73db514e13a93c15578302c11fa (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5fb21e38d9d0e73db514e13a93c15578302c11fa
Commit message: "Merge pull request #7805: [BEAM-3061] Done notifications for BigtableIO.Write"
 > git rev-list --no-walk a35373308d3901ca729a21b7c797c5288ec09671 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4865348809230032720.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7044279003834944998.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5810192045846268520.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins212589184474314164.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1331435803745000647.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4985824359623919745.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5546810722524724213.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-15 18:25:11,002 3374c662 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/3374c662/pkb.log>
2019-07-15 18:25:11,002 3374c662 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-15 18:25:11,004 3374c662 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-15 18:25:11,363 3374c662 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-15 18:25:11,388 3374c662 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-15 18:25:11,415 3374c662 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-15 18:25:11,417 3374c662 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-15 18:25:11,419 3374c662 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-3374c662 --format json --quiet --project apache-beam-testing
2019-07-15 18:25:16,183 3374c662 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-3374c662 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-3374c662

2019-07-15 18:25:16,184 3374c662 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-3374c662 --format json --quiet --project apache-beam-testing
2019-07-15 18:25:16,790 3374c662 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-3374c662 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-3374c662

2019-07-15 18:25:16,792 3374c662 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-15 18:25:16,793 3374c662 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-15 18:25:16,793 3374c662 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-15 18:25:16,794 3374c662 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/3374c662/pkb.log>
2019-07-15 18:25:16,794 3374c662 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/3374c662/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3404

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3404/display/redirect?page=changes>

Changes:

[github] Some trivial typos

[robertwb] [BEAM-7737] Fix microbenchmark scripts compiled check (#9066)

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-3 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a35373308d3901ca729a21b7c797c5288ec09671 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a35373308d3901ca729a21b7c797c5288ec09671
Commit message: "[BEAM-7737] Fix microbenchmark scripts compiled check (#9066)"
 > git rev-list --no-walk f7cbf88f550c8918b99a13af4182d6efa07cd2b5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7507379126688731340.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4938744963581137216.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins611191492513971867.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3753348810561616217.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4033350004427864376.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7247216053540671418.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4179481744917168791.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-15 12:30:07,524 7e78b536 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7e78b536/pkb.log>
2019-07-15 12:30:07,525 7e78b536 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-15 12:30:07,526 7e78b536 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-15 12:30:07,732 7e78b536 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-15 12:30:07,757 7e78b536 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-15 12:30:07,781 7e78b536 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-15 12:30:07,783 7e78b536 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-15 12:30:07,784 7e78b536 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-7e78b536 --format json --quiet --project apache-beam-testing
2019-07-15 12:30:09,029 7e78b536 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-7e78b536 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-7e78b536

2019-07-15 12:30:09,030 7e78b536 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-7e78b536 --format json --quiet --project apache-beam-testing
2019-07-15 12:30:09,632 7e78b536 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-7e78b536 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-7e78b536

2019-07-15 12:30:09,635 7e78b536 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-15 12:30:09,636 7e78b536 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-15 12:30:09,636 7e78b536 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-15 12:30:09,636 7e78b536 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7e78b536/pkb.log>
2019-07-15 12:30:09,636 7e78b536 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7e78b536/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3403

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3403/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f7cbf88f550c8918b99a13af4182d6efa07cd2b5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f7cbf88f550c8918b99a13af4182d6efa07cd2b5
Commit message: "[BEAM-7437] Raise RuntimeError for PY2 in BigqueryFullResultStreamingMatcher (#9044)"
 > git rev-list --no-walk f7cbf88f550c8918b99a13af4182d6efa07cd2b5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6014834793488760237.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4253883855928538693.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5014361378373793872.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1137945378757062751.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2667795586014270084.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8788636556571743092.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2798954504410861641.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-15 06:16:38,247 53bc6f23 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/53bc6f23/pkb.log>
2019-07-15 06:16:38,248 53bc6f23 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-15 06:16:38,249 53bc6f23 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-15 06:16:38,446 53bc6f23 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-15 06:16:38,471 53bc6f23 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-15 06:16:38,496 53bc6f23 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-15 06:16:38,499 53bc6f23 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
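The `KeyError: 'nodisk'` above comes from `gcp_dpb_dataproc.py` indexing a disk-type mapping with a value the provider does not define. A minimal sketch of that failure mode, with purely hypothetical map contents and function names (the real source at line 140 indexes `self.spec.worker_group.disk_spec.disk_type` into a similar mapping):

```python
# Illustrative reconstruction of the failing lookup pattern; the dict
# contents and function names below are assumptions, not the actual
# PerfKitBenchmarker source.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Direct indexing reproduces the observed failure: an unmapped
    # value such as 'nodisk' raises KeyError during provisioning.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safely(disk_type):
    # A defensive variant: fall back to a default disk type instead
    # of aborting the benchmark with a bare KeyError.
    return DISK_TYPE_MAP.get(disk_type, 'pd-standard')
```

Under this reading, the benchmark fails before any cluster is created, which is why the subsequent `gcloud dataproc clusters delete`/`describe` cleanup calls both return NOT_FOUND.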
2019-07-15 06:16:38,500 53bc6f23 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-53bc6f23 --format json --quiet --project apache-beam-testing
2019-07-15 06:16:39,126 53bc6f23 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-53bc6f23 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-53bc6f23

2019-07-15 06:16:39,127 53bc6f23 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-53bc6f23 --format json --quiet --project apache-beam-testing
2019-07-15 06:16:39,714 53bc6f23 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-53bc6f23 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-53bc6f23

2019-07-15 06:16:39,716 53bc6f23 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-15 06:16:39,717 53bc6f23 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-15 06:16:39,717 53bc6f23 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-15 06:16:39,718 53bc6f23 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/53bc6f23/pkb.log>
2019-07-15 06:16:39,718 53bc6f23 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/53bc6f23/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3402

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3402/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f7cbf88f550c8918b99a13af4182d6efa07cd2b5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f7cbf88f550c8918b99a13af4182d6efa07cd2b5
Commit message: "[BEAM-7437] Raise RuntimeError for PY2 in BigqueryFullResultStreamingMatcher (#9044)"
 > git rev-list --no-walk f7cbf88f550c8918b99a13af4182d6efa07cd2b5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1804317361608040373.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6103288343084027163.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8040888498474303771.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4095966132710050036.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4328960475809319365.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4656696979495471949.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7867830407969318858.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-15 00:16:27,801 6500dfa5 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/6500dfa5/pkb.log>
2019-07-15 00:16:27,802 6500dfa5 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-15 00:16:27,803 6500dfa5 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-15 00:16:27,985 6500dfa5 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-15 00:16:28,011 6500dfa5 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-15 00:16:28,035 6500dfa5 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-15 00:16:28,038 6500dfa5 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-15 00:16:28,039 6500dfa5 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-6500dfa5 --format json --quiet --project apache-beam-testing
2019-07-15 00:16:28,640 6500dfa5 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-6500dfa5 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-6500dfa5

2019-07-15 00:16:28,640 6500dfa5 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-6500dfa5 --format json --quiet --project apache-beam-testing
2019-07-15 00:16:29,208 6500dfa5 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-6500dfa5 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-6500dfa5

2019-07-15 00:16:29,211 6500dfa5 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-15 00:16:29,212 6500dfa5 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-15 00:16:29,212 6500dfa5 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-15 00:16:29,212 6500dfa5 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/6500dfa5/pkb.log>
2019-07-15 00:16:29,212 6500dfa5 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/6500dfa5/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3401

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3401/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f7cbf88f550c8918b99a13af4182d6efa07cd2b5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f7cbf88f550c8918b99a13af4182d6efa07cd2b5
Commit message: "[BEAM-7437] Raise RuntimeError for PY2 in BigqueryFullResultStreamingMatcher (#9044)"
 > git rev-list --no-walk f7cbf88f550c8918b99a13af4182d6efa07cd2b5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4197002437356096062.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3620541449792804728.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9211434282332739518.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1251944257135370878.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1563987577906185590.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4512556172628558510.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2441892338764442444.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-14 18:17:01,941 6f0f62f0 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/6f0f62f0/pkb.log>
2019-07-14 18:17:01,942 6f0f62f0 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-14 18:17:01,943 6f0f62f0 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
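The --config_override flag in the command above patches one dotted path in the benchmark's default configuration, and the WARNING that the key "flags" was not in the default config is emitted during that override merge. A rough sketch of how a dotted key=value override can be folded into a nested config dict (an illustrative reimplementation, not PKB's actual merge code):

```python
def apply_override(config, override):
    # Split "a.b.c=value" into a key path and a value, then walk the
    # nested dicts, creating intermediate levels as needed, and set
    # the final key.  Values are kept as strings, as on the CLI.
    path, _, value = override.partition('=')
    keys = path.split('.')
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config

cfg = {'dpb_wordcount_benchmark': {'dpb_service': {'service_type': 'dataflow'}}}
apply_override(cfg,
               'dpb_wordcount_benchmark.dpb_service.service_type=dataproc')
print(cfg['dpb_wordcount_benchmark']['dpb_service']['service_type'])  # dataproc
```

A merge like this happily creates keys that never existed in the defaults, which is why a typo in an override path can only be flagged as a warning rather than rejected outright.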
2019-07-14 18:17:02,244 6f0f62f0 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-14 18:17:02,269 6f0f62f0 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-14 18:17:02,295 6f0f62f0 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-14 18:17:02,298 6f0f62f0 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-14 18:17:02,299 6f0f62f0 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-6f0f62f0 --format json --quiet --project apache-beam-testing
2019-07-14 18:17:02,926 6f0f62f0 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-6f0f62f0 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-6f0f62f0

2019-07-14 18:17:02,927 6f0f62f0 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-6f0f62f0 --format json --quiet --project apache-beam-testing
2019-07-14 18:17:03,540 6f0f62f0 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-6f0f62f0 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-6f0f62f0

2019-07-14 18:17:03,542 6f0f62f0 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-14 18:17:03,543 6f0f62f0 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-14 18:17:03,543 6f0f62f0 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-14 18:17:03,543 6f0f62f0 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/6f0f62f0/pkb.log>
2019-07-14 18:17:03,543 6f0f62f0 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/6f0f62f0/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3400

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3400/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f7cbf88f550c8918b99a13af4182d6efa07cd2b5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f7cbf88f550c8918b99a13af4182d6efa07cd2b5
Commit message: "[BEAM-7437] Raise RuntimeError for PY2 in BigqueryFullResultStreamingMatcher (#9044)"
 > git rev-list --no-walk f7cbf88f550c8918b99a13af4182d6efa07cd2b5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1158629386200829836.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7056316462255378031.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6300925477438278868.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8880999125955408068.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1924494926835406003.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3170957919796244694.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7524898244659118918.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-14 12:18:23,940 041a20a4 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/041a20a4/pkb.log>
2019-07-14 12:18:23,954 041a20a4 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-14 12:18:23,957 041a20a4 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-14 12:18:24,496 041a20a4 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-14 12:18:24,649 041a20a4 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-14 12:18:24,789 041a20a4 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-14 12:18:24,806 041a20a4 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
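The traceback above ends in an unguarded dictionary lookup: `gcp_dpb_dataproc.py` indexes a disk-type mapping with the spec's `disk_type` value, and `'nodisk'` is not a key in that mapping. A minimal sketch of the failure mode and a tolerant variant follows; the mapping contents and function names here are illustrative, not PerfKitBenchmarker's actual code:

```python
# Illustrative only: a pared-down version of the failing lookup pattern.
DISK_TYPE_TO_API_NAME = {
    "pd-standard": "pd-standard",
    "pd-ssd": "pd-ssd",
}

def resolve_disk_type(disk_type):
    # Unguarded lookup: raises KeyError('nodisk') exactly like the traceback.
    return DISK_TYPE_TO_API_NAME[disk_type]

def resolve_disk_type_safe(disk_type, default=None):
    # Tolerant variant: fall back to a default, or fail with a clearer message.
    try:
        return DISK_TYPE_TO_API_NAME[disk_type]
    except KeyError:
        if default is not None:
            return default
        raise ValueError("unsupported disk_type: %r" % (disk_type,))
```

In the guarded variant the benchmark would either proceed with a fallback disk type or fail during provisioning with a message naming the bad value, rather than a bare KeyError.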
2019-07-14 12:18:24,812 041a20a4 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-041a20a4 --format json --quiet --project apache-beam-testing
2019-07-14 12:18:28,398 041a20a4 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-041a20a4 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-041a20a4

2019-07-14 12:18:28,399 041a20a4 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-041a20a4 --format json --quiet --project apache-beam-testing
2019-07-14 12:18:31,206 041a20a4 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-041a20a4 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-041a20a4

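The teardown above is deliberately tolerant: provisioning failed before the cluster was ever created, so both `gcloud dataproc clusters delete` and `describe` return NOT_FOUND with exit code 1, and the run continues to the status summary. A hedged sketch of that "already gone counts as cleaned up" classification (the function name and categories are assumptions, not PerfKitBenchmarker's API):

```python
def classify_cleanup(returncode, stderr):
    """Interpret a gcloud delete result during teardown.

    Treats NOT_FOUND as success: if the resource never existed or is
    already deleted, there is nothing left to clean up.
    """
    if returncode == 0:
        return "deleted"
    if "NOT_FOUND" in stderr:
        return "already-absent"
    return "failed"
```

Applied to the log above, `classify_cleanup(1, "ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: ...")` would report the cluster as already absent instead of failing the teardown phase.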
2019-07-14 12:18:31,210 041a20a4 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-14 12:18:31,210 041a20a4 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-14 12:18:31,211 041a20a4 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-14 12:18:31,211 041a20a4 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/041a20a4/pkb.log>
2019-07-14 12:18:31,211 041a20a4 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/041a20a4/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3399

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3399/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f7cbf88f550c8918b99a13af4182d6efa07cd2b5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f7cbf88f550c8918b99a13af4182d6efa07cd2b5
Commit message: "[BEAM-7437] Raise RuntimeError for PY2 in BigqueryFullResultStreamingMatcher (#9044)"
 > git rev-list --no-walk f7cbf88f550c8918b99a13af4182d6efa07cd2b5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4406813431891938787.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4591969063982295830.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6628471045203817836.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5613768179290684626.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins770659047238977686.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3507536166771517066.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1842582884513776995.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-14 06:15:16,771 aee98abc MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/aee98abc/pkb.log>
2019-07-14 06:15:16,772 aee98abc MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-14 06:15:16,773 aee98abc MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-14 06:15:17,072 aee98abc MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-14 06:15:17,096 aee98abc MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-14 06:15:17,117 aee98abc MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-14 06:15:17,120 aee98abc MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-14 06:15:17,121 aee98abc MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-aee98abc --format json --quiet --project apache-beam-testing
2019-07-14 06:15:18,681 aee98abc MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-aee98abc --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-aee98abc

2019-07-14 06:15:18,682 aee98abc MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-aee98abc --format json --quiet --project apache-beam-testing
2019-07-14 06:15:19,223 aee98abc MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-aee98abc --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-aee98abc

2019-07-14 06:15:19,226 aee98abc MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-14 06:15:19,226 aee98abc MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-14 06:15:19,226 aee98abc MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-14 06:15:19,227 aee98abc MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/aee98abc/pkb.log>
2019-07-14 06:15:19,227 aee98abc MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/aee98abc/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org

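The `KeyError: 'nodisk'` above comes from a plain dict lookup on the worker group's disk type in PKB's Dataproc provider, so a disk type that the mapping does not know about blows up provisioning before the cluster exists. A minimal sketch of that failure pattern and a guarded alternative follows; the names and mapping here are hypothetical illustrations, not PKB's actual code:

```python
# Hypothetical mapping from a PKB disk_type string to the value passed to
# gcloud; 'nodisk' is absent, which is the shape of the failure above.
DISK_TYPE_TO_GCLOUD_ARG = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}


def resolve_disk_arg(disk_type):
    """Return the gcloud disk argument for a PKB disk_type.

    A direct index, DISK_TYPE_TO_GCLOUD_ARG[disk_type], raises a bare
    KeyError for an unsupported type. Catching it and re-raising with the
    list of supported types keeps the log message actionable.
    """
    try:
        return DISK_TYPE_TO_GCLOUD_ARG[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported disk_type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_TO_GCLOUD_ARG)))
```

With this guard, a benchmark config that selects an unmapped disk type fails with a message naming the supported values instead of a bare `KeyError`.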

Build failed in Jenkins: beam_PerformanceTests_Spark #3398

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3398/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f7cbf88f550c8918b99a13af4182d6efa07cd2b5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f7cbf88f550c8918b99a13af4182d6efa07cd2b5
Commit message: "[BEAM-7437] Raise RuntimeError for PY2 in BigqueryFullResultStreamingMatcher (#9044)"
 > git rev-list --no-walk f7cbf88f550c8918b99a13af4182d6efa07cd2b5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1766689519862120611.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5149427196164970287.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2216998134596309370.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7286939799486805027.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4059220843301383511.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2733009899735678980.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2053820353666194682.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-14 00:17:21,388 2fe0a677 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2fe0a677/pkb.log>
2019-07-14 00:17:21,389 2fe0a677 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-14 00:17:21,390 2fe0a677 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-14 00:17:21,786 2fe0a677 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-14 00:17:21,810 2fe0a677 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-14 00:17:21,839 2fe0a677 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-14 00:17:21,841 2fe0a677 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-14 00:17:21,843 2fe0a677 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-2fe0a677 --format json --quiet --project apache-beam-testing
2019-07-14 00:17:22,392 2fe0a677 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-2fe0a677 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-2fe0a677

2019-07-14 00:17:22,393 2fe0a677 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-2fe0a677 --format json --quiet --project apache-beam-testing
2019-07-14 00:17:22,905 2fe0a677 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-2fe0a677 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-2fe0a677

2019-07-14 00:17:22,908 2fe0a677 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-14 00:17:22,908 2fe0a677 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-14 00:17:22,909 2fe0a677 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-14 00:17:22,909 2fe0a677 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2fe0a677/pkb.log>
2019-07-14 00:17:22,909 2fe0a677 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2fe0a677/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3397

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3397/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f7cbf88f550c8918b99a13af4182d6efa07cd2b5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f7cbf88f550c8918b99a13af4182d6efa07cd2b5
Commit message: "[BEAM-7437] Raise RuntimeError for PY2 in BigqueryFullResultStreamingMatcher (#9044)"
 > git rev-list --no-walk f7cbf88f550c8918b99a13af4182d6efa07cd2b5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7796636437785694759.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4244269596740255520.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1887220911686653697.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8450241435036626631.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8684339653943145786.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3937635581142339022.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6821332626960813623.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-13 18:23:18,992 712a4db0 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/712a4db0/pkb.log>
2019-07-13 18:23:18,993 712a4db0 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-13 18:23:18,994 712a4db0 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-13 18:23:19,307 712a4db0 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-13 18:23:19,332 712a4db0 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-13 18:23:19,357 712a4db0 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-13 18:23:19,359 712a4db0 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
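The KeyError in the traceback above comes from indexing a disk-type mapping with a value ('nodisk') it does not contain. A minimal sketch of that failure mode — the dict name, its contents, and the helper functions here are assumptions for illustration, not the actual gcp_dpb_dataproc.py code:

```python
# Hypothetical reconstruction of the failing lookup; the real mapping
# in gcp_dpb_dataproc.py may have different keys and values.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Direct indexing, as in the traceback: raises KeyError for 'nodisk'.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default=None):
    # Defensive variant: dict.get() returns a default instead of raising.
    return DISK_TYPE_MAP.get(disk_type, default)
```

A benchmark config whose disk spec resolves to an unmapped type would crash the first variant exactly as logged, while the second would let the caller detect and report the unsupported type.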
2019-07-13 18:23:19,361 712a4db0 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-712a4db0 --format json --quiet --project apache-beam-testing
2019-07-13 18:23:19,924 712a4db0 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-712a4db0 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-712a4db0

2019-07-13 18:23:19,925 712a4db0 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-712a4db0 --format json --quiet --project apache-beam-testing
2019-07-13 18:23:20,473 712a4db0 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-712a4db0 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-712a4db0

2019-07-13 18:23:20,476 712a4db0 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-13 18:23:20,477 712a4db0 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-13 18:23:20,477 712a4db0 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-13 18:23:20,477 712a4db0 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/712a4db0/pkb.log>
2019-07-13 18:23:20,477 712a4db0 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/712a4db0/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3396

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3396/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f7cbf88f550c8918b99a13af4182d6efa07cd2b5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f7cbf88f550c8918b99a13af4182d6efa07cd2b5
Commit message: "[BEAM-7437] Raise RuntimeError for PY2 in BigqueryFullResultStreamingMatcher (#9044)"
 > git rev-list --no-walk f7cbf88f550c8918b99a13af4182d6efa07cd2b5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7918358175791298550.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2115956977339046752.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins273837798920807726.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7912214987832909666.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins763193452337046910.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2312626550513150741.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8342244709135857657.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-13 12:19:03,115 734d4f5c MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/734d4f5c/pkb.log>
2019-07-13 12:19:03,115 734d4f5c MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-13 12:19:03,117 734d4f5c MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-13 12:19:03,247 734d4f5c MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-13 12:19:03,271 734d4f5c MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-13 12:19:03,294 734d4f5c MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-13 12:19:03,296 734d4f5c MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-13 12:19:03,297 734d4f5c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-734d4f5c --format json --quiet --project apache-beam-testing
2019-07-13 12:19:04,475 734d4f5c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-734d4f5c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-734d4f5c

2019-07-13 12:19:04,476 734d4f5c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-734d4f5c --format json --quiet --project apache-beam-testing
2019-07-13 12:19:05,023 734d4f5c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-734d4f5c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-734d4f5c

2019-07-13 12:19:05,026 734d4f5c MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-13 12:19:05,027 734d4f5c MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-13 12:19:05,027 734d4f5c MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-13 12:19:05,027 734d4f5c MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/734d4f5c/pkb.log>
2019-07-13 12:19:05,027 734d4f5c MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/734d4f5c/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3395

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3395/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f7cbf88f550c8918b99a13af4182d6efa07cd2b5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f7cbf88f550c8918b99a13af4182d6efa07cd2b5
Commit message: "[BEAM-7437] Raise RuntimeError for PY2 in BigqueryFullResultStreamingMatcher (#9044)"
 > git rev-list --no-walk f7cbf88f550c8918b99a13af4182d6efa07cd2b5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5791796350232277141.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2554472614243629200.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4182634376382536427.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1185122861213982105.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1667101846274046552.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4949488712761280847.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8409495067749375394.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-13 06:18:11,412 9348b244 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/9348b244/pkb.log>
2019-07-13 06:18:11,412 9348b244 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-13 06:18:11,413 9348b244 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-13 06:18:11,592 9348b244 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-13 06:18:11,617 9348b244 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-13 06:18:11,639 9348b244 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-13 06:18:11,642 9348b244 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-13 06:18:11,643 9348b244 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-9348b244 --format json --quiet --project apache-beam-testing
2019-07-13 06:18:12,794 9348b244 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-9348b244 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-9348b244

2019-07-13 06:18:12,795 9348b244 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-9348b244 --format json --quiet --project apache-beam-testing
2019-07-13 06:18:13,376 9348b244 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-9348b244 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-9348b244

2019-07-13 06:18:13,379 9348b244 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-13 06:18:13,379 9348b244 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-13 06:18:13,379 9348b244 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-13 06:18:13,379 9348b244 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/9348b244/pkb.log>
2019-07-13 06:18:13,379 9348b244 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/9348b244/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3394

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3394/display/redirect?page=changes>

Changes:

[b_m.vishwas] [BEAM-7731] Adding helper function to handle if null else pattern for

[udim] [BEAM-7437] Raise RuntimeError for PY2 in

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f7cbf88f550c8918b99a13af4182d6efa07cd2b5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f7cbf88f550c8918b99a13af4182d6efa07cd2b5
Commit message: "[BEAM-7437] Raise RuntimeError for PY2 in BigqueryFullResultStreamingMatcher (#9044)"
 > git rev-list --no-walk 60f70bf76faf57f7575de2723f33630f7ca583a9 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5084386530965672789.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins319589723863897781.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5305001731390853087.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6950617106018316340.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6648926562946115799.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2748526654097711930.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7747555255470982963.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-13 00:22:36,871 d055288f MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d055288f/pkb.log>
2019-07-13 00:22:36,871 d055288f MainThread INFO     PerfKitBenchmarker version: v1.12.0-1267-g580fa34
2019-07-13 00:22:36,873 d055288f MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-13 00:22:37,094 d055288f MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-13 00:22:37,124 d055288f MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-13 00:22:37,148 d055288f MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-13 00:22:37,150 d055288f MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
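The `KeyError: 'nodisk'` above is an unguarded dictionary lookup: the benchmark's disk type `'nodisk'` has no entry in the Dataproc provider's disk-type mapping, so indexing the mapping raises a bare `KeyError` during provisioning. A minimal sketch of the failure mode and a guarded alternative (the mapping name, keys, and helper function here are hypothetical illustrations, not PerfKitBenchmarker's actual code):

```python
# Hypothetical disk-type mapping, analogous to the one indexed at
# gcp_dpb_dataproc.py line 140 in the traceback above (keys are assumptions).
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}


def resolve_disk_type(disk_type):
    """Guarded lookup: raise a descriptive error instead of a bare KeyError."""
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported disk type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))


# An unguarded lookup reproduces the traceback's failure:
#     DISK_TYPE_MAP['nodisk']   raises KeyError: 'nodisk'
```

With such a guard the provisioning phase would fail with a message naming the unsupported disk type, rather than the bare `KeyError: 'nodisk'` seen in this run.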
2019-07-13 00:22:37,151 d055288f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-d055288f --format json --quiet --project apache-beam-testing
2019-07-13 00:22:38,480 d055288f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-d055288f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-d055288f

2019-07-13 00:22:38,481 d055288f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-d055288f --format json --quiet --project apache-beam-testing
2019-07-13 00:22:39,121 d055288f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-d055288f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-d055288f

2019-07-13 00:22:39,124 d055288f MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-13 00:22:39,124 d055288f MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-13 00:22:39,125 d055288f MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-13 00:22:39,125 d055288f MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d055288f/pkb.log>
2019-07-13 00:22:39,125 d055288f MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d055288f/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3393

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3393/display/redirect?page=changes>

Changes:

[alireza4263] [BEAM-7729] Fixes the bug by checking the value first before parsing it.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 60f70bf76faf57f7575de2723f33630f7ca583a9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 60f70bf76faf57f7575de2723f33630f7ca583a9
Commit message: "Merge pull request #9045 from riazela/BigQueryNullableBugFix"
 > git rev-list --no-walk 0fce2b88660f52dae638697e1472aa108c982ae6 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1138945914148800870.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8009528647251606844.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4087740467862744891.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4785224167833088917.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5853477992810638046.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5443381567317540228.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8133391701816444835.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-12 18:30:10,120 10d0557c MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/10d0557c/pkb.log>
2019-07-12 18:30:10,120 10d0557c MainThread INFO     PerfKitBenchmarker version: v1.12.0-1264-gc6a08b4
2019-07-12 18:30:10,121 10d0557c MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-12 18:30:10,329 10d0557c MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-12 18:30:10,352 10d0557c MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-12 18:30:10,372 10d0557c MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-12 18:30:10,374 10d0557c MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-12 18:30:10,375 10d0557c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-10d0557c --format json --quiet --project apache-beam-testing
2019-07-12 18:30:12,082 10d0557c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-10d0557c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-10d0557c

2019-07-12 18:30:12,082 10d0557c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-10d0557c --format json --quiet --project apache-beam-testing
2019-07-12 18:30:12,619 10d0557c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-10d0557c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-10d0557c

2019-07-12 18:30:12,621 10d0557c MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-12 18:30:12,622 10d0557c MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-12 18:30:12,622 10d0557c MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-12 18:30:12,622 10d0557c MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/10d0557c/pkb.log>
2019-07-12 18:30:12,622 10d0557c MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/10d0557c/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3392

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3392/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0fce2b88660f52dae638697e1472aa108c982ae6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0fce2b88660f52dae638697e1472aa108c982ae6
Commit message: "Merge pull request #9038 from lukecwik/vendor2"
 > git rev-list --no-walk 0fce2b88660f52dae638697e1472aa108c982ae6 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins631481527206115676.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2913362430342671897.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7613152465079833288.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6028171242386626550.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8853243145820358359.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins555365039756592212.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6048374905934540533.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-12 12:25:10,617 bfc7b334 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/bfc7b334/pkb.log>
2019-07-12 12:25:10,617 bfc7b334 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1264-gc6a08b4
2019-07-12 12:25:10,619 bfc7b334 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-12 12:25:10,931 bfc7b334 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-12 12:25:10,954 bfc7b334 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-12 12:25:10,975 bfc7b334 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-12 12:25:10,978 bfc7b334 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-12 12:25:10,979 bfc7b334 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-bfc7b334 --format json --quiet --project apache-beam-testing
2019-07-12 12:25:12,404 bfc7b334 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-bfc7b334 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-bfc7b334

2019-07-12 12:25:12,405 bfc7b334 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-bfc7b334 --format json --quiet --project apache-beam-testing
2019-07-12 12:25:12,949 bfc7b334 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-bfc7b334 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-bfc7b334

2019-07-12 12:25:12,951 bfc7b334 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-12 12:25:12,952 bfc7b334 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-12 12:25:12,952 bfc7b334 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-12 12:25:12,952 bfc7b334 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/bfc7b334/pkb.log>
2019-07-12 12:25:12,953 bfc7b334 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/bfc7b334/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3391

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3391/display/redirect?page=changes>

Changes:

[lcwik] Ensure that publishing vendored artifacts checks the contents of the jar

[lcwik] [BEAM-4948, BEAM-6267, BEAM-5559, BEAM-7289] Fix shading of vendored

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 0fce2b88660f52dae638697e1472aa108c982ae6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 0fce2b88660f52dae638697e1472aa108c982ae6
Commit message: "Merge pull request #9038 from lukecwik/vendor2"
 > git rev-list --no-walk 10fd794e7e06bc77bb2695d3c51714c16aa78c0c # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3526220773442611346.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2483577016898345758.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7146541556346374970.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7084562483080266236.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3047398917061660527.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7658339963741260738.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4379714921534681528.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-12 06:18:30,386 d2c69413 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d2c69413/pkb.log>
2019-07-12 06:18:30,387 d2c69413 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1264-gc6a08b4
2019-07-12 06:18:30,388 d2c69413 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-12 06:18:30,682 d2c69413 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-12 06:18:30,707 d2c69413 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-12 06:18:30,731 d2c69413 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-12 06:18:30,733 d2c69413 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
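[Editor's note] The KeyError above comes from indexing a provider disk-type mapping with a value ('nodisk') that has no entry. A minimal sketch of that failure mode and a defensive lookup follows; `DISK_TYPE_MAP`, `resolve_disk_type`, and the listed disk types are illustrative assumptions, not PerfKitBenchmarker's actual code in gcp_dpb_dataproc.py:

```python
# Hypothetical disk-type map; the real mapping in PerfKitBenchmarker's
# providers/gcp/gcp_dpb_dataproc.py differs from this sketch.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}


def resolve_disk_type(disk_type):
    """Look up a provider disk type, raising a clear error instead of a
    bare KeyError when the spec's disk_type is unmapped (e.g. 'nodisk')."""
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported Dataproc disk type: %r (known types: %s)'
            % (disk_type, sorted(DISK_TYPE_MAP)))
```

With this shape, an unmapped disk type fails provisioning with an actionable message rather than the opaque `KeyError: 'nodisk'` seen in the log.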
2019-07-12 06:18:30,735 d2c69413 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-d2c69413 --format json --quiet --project apache-beam-testing
2019-07-12 06:18:32,268 d2c69413 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-d2c69413 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-d2c69413

2019-07-12 06:18:32,269 d2c69413 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-d2c69413 --format json --quiet --project apache-beam-testing
2019-07-12 06:18:32,946 d2c69413 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-d2c69413 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-d2c69413

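[Editor's note] The teardown above deletes and describes a cluster that was never created, so both gcloud calls exit non-zero with NOT_FOUND and PKB simply logs them. A cleanup helper can treat NOT_FOUND as "already gone". The sketch below is an assumption, not PKB code: the function name, signature, and injectable `run` parameter are invented for illustration.

```python
import subprocess


def delete_cluster(name, project, run=subprocess.run):
    """Delete a Dataproc cluster idempotently: a NOT_FOUND error from
    gcloud means the cluster is already absent, which is not a failure."""
    result = run(
        ['gcloud', 'dataproc', 'clusters', 'delete', name,
         '--format', 'json', '--quiet', '--project', project],
        capture_output=True, text=True)
    if result.returncode == 0:
        return True                      # cluster existed and was deleted
    if 'NOT_FOUND' in (result.stderr or ''):
        return False                     # already gone; treat as clean
    raise RuntimeError('cluster delete failed: %s' % result.stderr)
```

Injecting `run` keeps the helper testable without gcloud installed, mirroring how the log's cleanup phase tolerates the NOT_FOUND responses shown above.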
2019-07-12 06:18:32,949 d2c69413 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-12 06:18:32,950 d2c69413 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-12 06:18:32,950 d2c69413 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-12 06:18:32,951 d2c69413 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d2c69413/pkb.log>
2019-07-12 06:18:32,951 d2c69413 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d2c69413/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3390

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3390/display/redirect?page=changes>

Changes:

[bmvishwas] [BEAM-7621] Null pointer exception when accessing null row fields in

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 10fd794e7e06bc77bb2695d3c51714c16aa78c0c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 10fd794e7e06bc77bb2695d3c51714c16aa78c0c
Commit message: "Merge pull request #8930: [BEAM-7621] Null pointer exception when accessing null row fields in BeamSql"
 > git rev-list --no-walk f2a2f2a791ceb9e838590ce1cd201321757c47dd # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2596840769210115148.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins613252999270519243.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2640293827038820245.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1064585955450829782.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1231856601779538930.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1811241630967135912.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8111988900368855096.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-12 00:22:23,761 c878c7a7 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c878c7a7/pkb.log>
2019-07-12 00:22:23,762 c878c7a7 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1263-g0af1037
2019-07-12 00:22:23,763 c878c7a7 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-12 00:22:24,352 c878c7a7 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-12 00:22:24,376 c878c7a7 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-12 00:22:24,397 c878c7a7 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-12 00:22:24,399 c878c7a7 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-12 00:22:24,400 c878c7a7 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-c878c7a7 --format json --quiet --project apache-beam-testing
2019-07-12 00:22:24,985 c878c7a7 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-c878c7a7 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-c878c7a7

2019-07-12 00:22:24,986 c878c7a7 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-c878c7a7 --format json --quiet --project apache-beam-testing
2019-07-12 00:22:25,478 c878c7a7 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-c878c7a7 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-c878c7a7

2019-07-12 00:22:25,480 c878c7a7 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-12 00:22:25,481 c878c7a7 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-12 00:22:25,481 c878c7a7 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-12 00:22:25,481 c878c7a7 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c878c7a7/pkb.log>
2019-07-12 00:22:25,481 c878c7a7 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c878c7a7/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3389

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3389/display/redirect?page=changes>

Changes:

[hsuryawirawan] Update Beam Katas (Java) course on Stepik

[hsuryawirawan] Update Beam Katas (Python) course on Stepik

[hsuryawirawan] Update the ParDo OneToMany task description to follow latest convention

[hsuryawirawan] Add Windowing Adding Timestamp using ParDo Java Kata

[hsuryawirawan] Add Windowing Adding Timestamp using WithTimestamps Java Kata

[hsuryawirawan] Add Fixed Time Window Java Kata

[hsuryawirawan] Add kata description for the Adding Timestamp katas

[hsuryawirawan] Rename 'util' package to 'org.apache.beam.learning.katas.util'

[hsuryawirawan] Add Triggers "Event Time Triggers" Java kata

[hsuryawirawan] Add "DoFn Additional Parameters" Java kata

[hsuryawirawan] Move Built-in IOs Task into a package

[hsuryawirawan] Add WithKeys Java kata

[hsuryawirawan] Change logging library to use log4j2 and add log4j2.xml in the util

[hsuryawirawan] Add "Early Triggers" Java kata

[hsuryawirawan] Update WithKeys task description javadoc link to use 'current'

[hsuryawirawan] Add "Window Accumulation Mode" Java kata

[hsuryawirawan] Add package for "Early Triggers" and "Window Accumulation Mode"

[hsuryawirawan] Update study_project.xml

[hsuryawirawan] Fix course syllabus on Stepik

[hsuryawirawan] Fix the wrong file path for TextIO Read kata

[hsuryawirawan] Update the allowed lateness and discarding accumulation mode kata

[hsuryawirawan] Update course to Stepik

[hsuryawirawan] Reupload Python Katas lessons

[github] Remove unused NEWLINE in TextSink.TextWriter

[kamil.wasilewski] [BEAM-7503] Added iteration parameter to CoGBK test in Python

[kamil.wasilewski] [BEAM-7503] Created CoGBK Python Load Test Jenkins job

[kamil.wasilewski] [BEAM-3959] Added a flake F821 test

[katarzyna.kucharczyk] [BEAM-5994] Fix condition which allows to publish metrics.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f2a2f2a791ceb9e838590ce1cd201321757c47dd (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f2a2f2a791ceb9e838590ce1cd201321757c47dd
Commit message: "Merge pull request #8989: Remove unused NEWLINE in TextSink.TextWriter"
 > git rev-list --no-walk 5b68aae06c705d2ee9e5588e88bc5a992f4edf64 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5457434224504003145.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3100028080494046879.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3713814011711373485.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6631684387526760790.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9141532258042263974.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3590672308359192995.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1891430261731696789.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-11 18:25:17,433 0b39f8d6 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0b39f8d6/pkb.log>
2019-07-11 18:25:17,433 0b39f8d6 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1258-ga2fb730
2019-07-11 18:25:17,434 0b39f8d6 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-11 18:25:17,579 0b39f8d6 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-11 18:25:17,602 0b39f8d6 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-11 18:25:17,624 0b39f8d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-11 18:25:17,626 0b39f8d6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-11 18:25:17,627 0b39f8d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-0b39f8d6 --format json --quiet --project apache-beam-testing
2019-07-11 18:25:18,213 0b39f8d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-0b39f8d6 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-0b39f8d6

2019-07-11 18:25:18,214 0b39f8d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-0b39f8d6 --format json --quiet --project apache-beam-testing
2019-07-11 18:25:18,724 0b39f8d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-0b39f8d6 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-0b39f8d6

2019-07-11 18:25:18,726 0b39f8d6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-11 18:25:18,727 0b39f8d6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-11 18:25:18,727 0b39f8d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-11 18:25:18,727 0b39f8d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0b39f8d6/pkb.log>
2019-07-11 18:25:18,727 0b39f8d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0b39f8d6/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3388

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3388/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5b68aae06c705d2ee9e5588e88bc5a992f4edf64 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5b68aae06c705d2ee9e5588e88bc5a992f4edf64
Commit message: "Merge pull request #9013: [BEAM-7689] make a temporary directory unique for FileBaseSink"
 > git rev-list --no-walk 5b68aae06c705d2ee9e5588e88bc5a992f4edf64 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7101388449800797079.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6313144091788923816.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4632212804954175073.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2130239035923284692.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1070676730502324050.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6933295546557661925.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8002401601897827655.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-11 12:18:58,582 a7b9f52e MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a7b9f52e/pkb.log>
2019-07-11 12:18:58,582 a7b9f52e MainThread INFO     PerfKitBenchmarker version: v1.12.0-1257-gedc4934
2019-07-11 12:18:58,584 a7b9f52e MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-11 12:18:58,843 a7b9f52e MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-11 12:18:58,866 a7b9f52e MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-11 12:18:58,888 a7b9f52e MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-11 12:18:58,899 a7b9f52e MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
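[Editorial note] The traceback shows `_Create()` in gcp_dpb_dataproc.py indexing a disk-type lookup table with the worker group's `disk_spec.disk_type`, and `'nodisk'` has no entry, so the bare subscript raises `KeyError` before any cluster is created. A minimal reproduction of that failure mode (the mapping contents and function name here are illustrative, not PerfKitBenchmarker's actual table):

```python
# Illustrative mapping; the real table in gcp_dpb_dataproc.py translates
# PKB disk-type names to Dataproc disk types such as pd-standard / pd-ssd.
DISK_TYPE_MAP = {'standard': 'pd-standard', 'remote_ssd': 'pd-ssd'}

def resolve_disk_type(disk_type):
    # Bare dict indexing reproduces the build failure: an unmapped type
    # like 'nodisk' raises KeyError instead of a descriptive error.
    return DISK_TYPE_MAP[disk_type]

try:
    resolve_disk_type('nodisk')
except KeyError as e:
    print('KeyError:', e)  # KeyError: 'nodisk'
```

A guard such as `DISK_TYPE_MAP.get(disk_type)` plus an explicit error message would fail the provision phase with a clearer diagnostic than a raw `KeyError`.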
2019-07-11 12:18:58,901 a7b9f52e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-a7b9f52e --format json --quiet --project apache-beam-testing
2019-07-11 12:19:00,373 a7b9f52e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-a7b9f52e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-a7b9f52e

2019-07-11 12:19:00,373 a7b9f52e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-a7b9f52e --format json --quiet --project apache-beam-testing
2019-07-11 12:19:00,913 a7b9f52e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-a7b9f52e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-a7b9f52e

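[Editorial note] The two `ReturnCode:1` results above are expected: provisioning aborted before the Dataproc cluster existed, so the cleanup `delete`/`describe` calls get `NOT_FOUND` from gcloud. A teardown routine can treat that as success; a sketch of such a check (the function name and call sites are hypothetical, not PerfKitBenchmarker's API):

```python
# Hypothetical helper: decide whether a gcloud teardown command left the
# resource absent, treating NOT_FOUND as "already gone" rather than failure.
def teardown_ok(return_code, stderr):
    """True when deletion succeeded or the resource never existed."""
    if return_code == 0:
        return True
    return "NOT_FOUND" in stderr

delete_stderr = ("ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: "
                 "Not found: Cluster .../clusters/pkb-a7b9f52e")
print(teardown_ok(1, delete_stderr))  # True: cluster was never provisioned
```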
2019-07-11 12:19:00,916 a7b9f52e MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-11 12:19:00,916 a7b9f52e MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-11 12:19:00,917 a7b9f52e MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-11 12:19:00,917 a7b9f52e MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a7b9f52e/pkb.log>
2019-07-11 12:19:00,917 a7b9f52e MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a7b9f52e/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3387

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3387/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5b68aae06c705d2ee9e5588e88bc5a992f4edf64 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5b68aae06c705d2ee9e5588e88bc5a992f4edf64
Commit message: "Merge pull request #9013: [BEAM-7689] make a temporary directory unique for FileBaseSink"
 > git rev-list --no-walk 5b68aae06c705d2ee9e5588e88bc5a992f4edf64 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6637340288717055868.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3573839567124738790.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1199669522495186561.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7848335763735025913.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins396006280985478675.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1645879347300772075.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9125615589078812360.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-11 06:16:31,352 96257893 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/96257893/pkb.log>
2019-07-11 06:16:31,353 96257893 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1257-gedc4934
2019-07-11 06:16:31,354 96257893 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-11 06:16:31,547 96257893 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-11 06:16:31,570 96257893 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-11 06:16:31,593 96257893 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-11 06:16:31,596 96257893 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-11 06:16:31,597 96257893 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-96257893 --format json --quiet --project apache-beam-testing
2019-07-11 06:16:32,315 96257893 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-96257893 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-96257893

2019-07-11 06:16:32,316 96257893 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-96257893 --format json --quiet --project apache-beam-testing
2019-07-11 06:16:32,949 96257893 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-96257893 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-96257893

2019-07-11 06:16:32,952 96257893 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-11 06:16:32,953 96257893 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-11 06:16:32,953 96257893 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-11 06:16:32,953 96257893 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/96257893/pkb.log>
2019-07-11 06:16:32,954 96257893 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/96257893/completion_statuses.json>
Build step 'Execute shell' marked build as failure
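[Editor's note] The `KeyError: 'nodisk'` above comes from an unguarded dictionary lookup in PerfKitBenchmarker's Dataproc provider (`gcp_dpb_dataproc.py` line 140): the configured disk type `'nodisk'` is indexed into a disk-type mapping that has no such key. The sketch below is a minimal, hypothetical reconstruction of that failure mode — the mapping contents and function names are assumptions; only the lookup pattern and the `'nodisk'` key come from the traceback.

```python
# Hypothetical disk-type mapping; the real one in gcp_dpb_dataproc.py may
# differ, but it evidently lacks a 'nodisk' entry.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Unguarded lookup, as in the traceback: raises KeyError('nodisk')
    # when the benchmark config specifies a disk type the map doesn't know.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type):
    # A defensive variant: surface a clear configuration error instead of
    # a bare KeyError during the provision phase.
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError('unsupported Dataproc disk type: %r' % disk_type)
```

Under this reading, the fix belongs in either the benchmark config (request a supported disk type) or the provider (handle the missing key explicitly), which matches the benchmark failing in the provision phase before any cluster is created — hence the subsequent `gcloud dataproc clusters delete`/`describe` calls returning NOT_FOUND.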

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3386

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3386/display/redirect?page=changes>

Changes:

[dcavazos] Add Python snippet for WithTimestamps transform

[kcweaver] [BEAM-7710] [website] remove outdated reference to KeyedCombineFn

[heejong] [BEAM-7689] make a temporary directory unique for FileBaseSink

[chamikara] [BEAM-7389] Add Python snippet for Partition transform (#8904)

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5b68aae06c705d2ee9e5588e88bc5a992f4edf64 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5b68aae06c705d2ee9e5588e88bc5a992f4edf64
Commit message: "Merge pull request #9013: [BEAM-7689] make a temporary directory unique for FileBaseSink"
 > git rev-list --no-walk 05dba6a5b9fb7e1753262f8eed5854c8e19cd794 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins438699930347796053.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins356070249774373719.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2266519870095406536.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2849267834180409303.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3745634921405766814.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins257116985040517285.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4147838831352067741.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-11 00:15:30,976 c871840f MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c871840f/pkb.log>
2019-07-11 00:15:30,976 c871840f MainThread INFO     PerfKitBenchmarker version: v1.12.0-1257-gedc4934
2019-07-11 00:15:30,977 c871840f MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-11 00:15:31,237 c871840f MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-11 00:15:31,260 c871840f MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-11 00:15:31,283 c871840f MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-11 00:15:31,298 c871840f MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-11 00:15:31,300 c871840f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-c871840f --format json --quiet --project apache-beam-testing
2019-07-11 00:15:32,979 c871840f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-c871840f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-c871840f

2019-07-11 00:15:32,980 c871840f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-c871840f --format json --quiet --project apache-beam-testing
2019-07-11 00:15:33,560 c871840f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-c871840f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-c871840f

2019-07-11 00:15:33,563 c871840f MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-11 00:15:33,563 c871840f MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-11 00:15:33,563 c871840f MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-11 00:15:33,563 c871840f MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c871840f/pkb.log>
2019-07-11 00:15:33,564 c871840f MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/c871840f/completion_statuses.json>
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_Spark #3385

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3385/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-7668] Add ability to query a pipeline definition from a gRPC

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 05dba6a5b9fb7e1753262f8eed5854c8e19cd794 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 05dba6a5b9fb7e1753262f8eed5854c8e19cd794
Commit message: "Merge PR #8977"
 > git rev-list --no-walk 337641758cf8d994f5ef3239885efdd343fc62f5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8333259460600807627.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1575421499023090053.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8765252557610809084.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins15525550512430085.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1970687577440765189.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7281985112428709170.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4355852693021565671.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-10 18:16:35,972 97574640 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/97574640/pkb.log>
2019-07-10 18:16:35,972 97574640 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1257-gedc4934
2019-07-10 18:16:35,973 97574640 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-10 18:16:36,120 97574640 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-10 18:16:36,143 97574640 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-10 18:16:36,166 97574640 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-10 18:16:36,168 97574640 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
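The root cause of the failure is the `KeyError: 'nodisk'` above: the Dataproc provider indexes a disk-type mapping with the benchmark spec's `disk_type`, and `'nodisk'` is not a key in that mapping. The sketch below reproduces the pattern with illustrative names (the mapping and helper functions are hypothetical, not PerfKitBenchmarker's actual code) and shows a defensive variant that falls back to a default instead of crashing during provisioning.

```python
# Illustrative disk-type mapping; the real one lives in gcp_dpb_dataproc.py
# and has different contents.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type_strict(disk_type):
    """Mirrors the failing code path: a bare dict lookup raises KeyError
    for any disk type not present in the mapping (e.g. 'nodisk')."""
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default='pd-standard'):
    """Defensive variant: unknown disk types fall back to a default."""
    return DISK_TYPE_MAP.get(disk_type, default)

try:
    resolve_disk_type_strict('nodisk')
except KeyError as e:
    print('lookup failed:', e)  # the same failure mode seen in this build

print(resolve_disk_type_safe('nodisk'))
```

Whether falling back silently is the right fix is a design question; rejecting the spec early with a clear error message listing the supported disk types would also avoid the opaque `KeyError` seen here.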
2019-07-10 18:16:36,170 97574640 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-97574640 --format json --quiet --project apache-beam-testing
2019-07-10 18:16:36,763 97574640 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-97574640 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-97574640

2019-07-10 18:16:36,763 97574640 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-97574640 --format json --quiet --project apache-beam-testing
2019-07-10 18:16:37,308 97574640 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-97574640 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-97574640
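The `gcloud dataproc clusters delete`/`describe` failures above are benign: provisioning crashed before the cluster was created, so cleanup is tearing down a resource that never existed and gcloud exits non-zero with `NOT_FOUND`. A teardown path can treat that case as success to keep cleanup idempotent. The helper below is a hypothetical sketch of that check, not PerfKitBenchmarker's actual code.

```python
def delete_succeeded(return_code, stderr):
    """Return True if a gcloud delete worked, or if the resource was
    already gone (NOT_FOUND), which is equivalent for cleanup purposes."""
    if return_code == 0:
        return True
    return 'NOT_FOUND' in stderr

# The delete in this log exited with ReturnCode:1 and a NOT_FOUND error,
# which this check classifies as an ignorable cleanup outcome.
stderr = ('ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: '
          'Cluster projects/apache-beam-testing/regions/global/clusters/pkb-97574640')
print(delete_succeeded(1, stderr))
```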

2019-07-10 18:16:37,311 97574640 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 140, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-10 18:16:37,311 97574640 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-10 18:16:37,311 97574640 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-10 18:16:37,311 97574640 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/97574640/pkb.log>
2019-07-10 18:16:37,312 97574640 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/97574640/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3384

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3384/display/redirect?page=changes>

Changes:

[valentyn] Use Beam's abstraction of pickler instead of dill in coder tests.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 337641758cf8d994f5ef3239885efdd343fc62f5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 337641758cf8d994f5ef3239885efdd343fc62f5
Commit message: "Merge pull request #8975 Use Beam's abstraction of pickler instead of dill in coder tests."
 > git rev-list --no-walk 4ba9989799efa5d9a0a76d89022fb6edd265255a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4806313125501986143.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins468856992224507673.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5673285614781012380.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins944230687620707118.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1396509152159563791.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8968686384736673998.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8567134609671506632.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-10 12:19:07,388 08e0ee97 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/08e0ee97/pkb.log>
2019-07-10 12:19:07,389 08e0ee97 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1255-gbda82fa
2019-07-10 12:19:07,390 08e0ee97 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-10 12:19:07,543 08e0ee97 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-10 12:19:07,566 08e0ee97 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-10 12:19:07,587 08e0ee97 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-10 12:19:07,589 08e0ee97 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
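For context, the `KeyError: 'nodisk'` above comes from an unguarded dict lookup on the worker group's disk type in `gcp_dpb_dataproc.py`. A minimal sketch of the failure mode (the map contents and the function name here are illustrative assumptions, not PerfKitBenchmarker's actual code):

```python
# Hypothetical disk-type map: the benchmark config requests 'nodisk'
# (no attached data disk), which the map does not contain.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Mirrors the failing line: a plain dict index raises KeyError for
    # any disk type the provider map does not know about.
    return DISK_TYPE_MAP[disk_type]

try:
    resolve_disk_type('nodisk')
except KeyError as e:
    print('KeyError:', e)  # KeyError: 'nodisk'
```

Guarding the lookup (e.g. `DISK_TYPE_MAP.get(disk_type)` plus an explicit error) would turn this into an actionable configuration message instead of a raw KeyError during provisioning.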
2019-07-10 12:19:07,590 08e0ee97 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-08e0ee97 --format json --quiet --project apache-beam-testing
2019-07-10 12:19:09,115 08e0ee97 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-08e0ee97 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-08e0ee97

2019-07-10 12:19:09,116 08e0ee97 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-08e0ee97 --format json --quiet --project apache-beam-testing
2019-07-10 12:19:09,799 08e0ee97 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-08e0ee97 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-08e0ee97

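The NOT_FOUND errors above are part of teardown, not a separate failure: provisioning never created the cluster, so both `delete` and `describe` report it missing and the run proceeds to its final status. A sketch of that classification (the function and return labels are illustrative, not PKB's actual implementation):

```python
def classify_delete(returncode, stderr):
    # NOT_FOUND during cleanup means the cluster never got created,
    # so teardown can continue rather than be treated as an error.
    if returncode == 0:
        return 'deleted'
    if 'NOT_FOUND' in stderr:
        return 'absent'
    return 'error'
```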
2019-07-10 12:19:09,802 08e0ee97 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-10 12:19:09,802 08e0ee97 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-10 12:19:09,803 08e0ee97 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-10 12:19:09,803 08e0ee97 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/08e0ee97/pkb.log>
2019-07-10 12:19:09,803 08e0ee97 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/08e0ee97/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3383

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3383/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4ba9989799efa5d9a0a76d89022fb6edd265255a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4ba9989799efa5d9a0a76d89022fb6edd265255a
Commit message: "[BEAM-7690] Port WordCountTest off DoFnTester"
 > git rev-list --no-walk 4ba9989799efa5d9a0a76d89022fb6edd265255a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6401794432992583066.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4265626052721813912.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins493989526064132156.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5056915725200036773.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8569603233563700005.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7837262714106356766.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2388612483046500899.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-10 06:16:29,282 2601c5d2 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2601c5d2/pkb.log>
2019-07-10 06:16:29,282 2601c5d2 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1255-gbda82fa
2019-07-10 06:16:29,284 2601c5d2 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-10 06:16:29,589 2601c5d2 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-10 06:16:29,611 2601c5d2 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-10 06:16:29,634 2601c5d2 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-10 06:16:29,636 2601c5d2 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-10 06:16:29,637 2601c5d2 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-2601c5d2 --format json --quiet --project apache-beam-testing
2019-07-10 06:16:31,271 2601c5d2 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-2601c5d2 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-2601c5d2

2019-07-10 06:16:31,272 2601c5d2 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-2601c5d2 --format json --quiet --project apache-beam-testing
2019-07-10 06:16:31,861 2601c5d2 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-2601c5d2 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-2601c5d2

2019-07-10 06:16:31,863 2601c5d2 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-10 06:16:31,864 2601c5d2 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-10 06:16:31,864 2601c5d2 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-10 06:16:31,864 2601c5d2 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2601c5d2/pkb.log>
2019-07-10 06:16:31,864 2601c5d2 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2601c5d2/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3382

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3382/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-7535] Created Jenkins job for BQ performance tests

[kamil.wasilewski] [BEAM-7535] Delete existing data if the table already exists

[hannahjiang] BEAM-3645 add thread lock

[cademarkegard] [BEAM-7690] Port WordCountTest off DoFnTester

[github] [BEAM-7709] Re-use node for explicit flattens

[boyuanz] Reformat CamelCase function naming style to underscore style for

[boyuanz] fix lint

[chambers] Update Python Dataflow runner to patch side input coders on the unified

[iemejia] [BEAM-7653] Add PTransformTranslator for Combine.GroupedValues

[pachristopher] Update pyarrow version requirement in setup.py

[github] Update code comments to improve readability in docs (#9024)

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-4 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4ba9989799efa5d9a0a76d89022fb6edd265255a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4ba9989799efa5d9a0a76d89022fb6edd265255a
Commit message: "[BEAM-7690] Port WordCountTest off DoFnTester"
 > git rev-list --no-walk 66bd44f4ae41cd06e6b3d1cd0a05e12ec7d4b6fa # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3139602859399509005.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4947835635299278854.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3766282407445975838.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5382309590010603111.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5111212337883988765.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2906437907829870629.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8991183557130700066.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-10 00:18:40,339 f5112e8b MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/f5112e8b/pkb.log>
2019-07-10 00:18:40,340 f5112e8b MainThread INFO     PerfKitBenchmarker version: v1.12.0-1255-gbda82fa
2019-07-10 00:18:40,341 f5112e8b MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
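The `--config_override` flag above injects a value into the benchmark config by dotted path (here, forcing the dpb service type to `dataproc`). A minimal sketch of how such an override can be applied to a nested config dict; the function name and parsing details are illustrative, not the actual PerfKitBenchmarker implementation:

```python
def apply_override(config, override):
    # "--config_override=a.b.c=value" sets config["a"]["b"]["c"] = value.
    # Illustrative only; the real PerfKitBenchmarker parser may differ.
    path, _, value = override.partition("=")
    keys = path.split(".")
    node = config
    for key in keys[:-1]:
        # Create intermediate dicts as needed while walking the path.
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config
```

Note the warning a few lines below ("The key \"flags\" was not in the default config"): overrides that name a key absent from the defaults are accepted but flagged as possible typos.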
2019-07-10 00:18:40,676 f5112e8b MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-10 00:18:40,699 f5112e8b MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-10 00:18:40,722 f5112e8b MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-10 00:18:40,725 f5112e8b MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
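The `KeyError: 'nodisk'` above is the classic signature of a plain dict index: the Dataproc provider looks the worker group's `disk_spec.disk_type` up in a disk-type mapping, and `'nodisk'` has no entry. A minimal sketch of the failing pattern and a defensive variant; the map contents and function names are illustrative, not the actual PerfKitBenchmarker source:

```python
# Hypothetical disk-type map; the real one lives in gcp_dpb_dataproc.py.
DISK_TYPE_MAP = {
    "pd-standard": "pd-standard",
    "pd-ssd": "pd-ssd",
}

def resolve_disk_type(disk_type):
    # Plain indexing: raises KeyError for unmapped types such as
    # 'nodisk', mirroring the traceback above.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default=None):
    # Defensive variant: fall back to a default instead of raising,
    # so provisioning can decide how to handle an unsupported type.
    return DISK_TYPE_MAP.get(disk_type, default)
```

Since the benchmark spec is evidently configured with a disk type the Dataproc provider does not map, the fix is either to add a `'nodisk'` entry to the map or to configure a supported disk type in the benchmark spec.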
2019-07-10 00:18:40,727 f5112e8b MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-f5112e8b --format json --quiet --project apache-beam-testing
2019-07-10 00:18:41,331 f5112e8b MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-f5112e8b --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-f5112e8b

2019-07-10 00:18:41,332 f5112e8b MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-f5112e8b --format json --quiet --project apache-beam-testing
2019-07-10 00:18:41,908 f5112e8b MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-f5112e8b --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-f5112e8b

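The two NOT_FOUND errors above are expected: provisioning failed before the cluster was ever created, so the teardown's `delete`/`describe` calls find nothing, and PKB treats that as "already gone" rather than a new failure. A sketch of that best-effort cleanup pattern, with an injectable runner so it can be exercised without `gcloud`; this is an assumed shape, not PerfKitBenchmarker's actual code:

```python
import subprocess

def delete_cluster(name, project, run=subprocess.run):
    """Best-effort teardown: NOT_FOUND means the cluster was never
    created, so treat it as already deleted instead of failing."""
    # Command mirrors the log above.
    result = run(
        ["gcloud", "dataproc", "clusters", "delete", name,
         "--format", "json", "--quiet", "--project", project],
        capture_output=True, text=True)
    if result.returncode != 0:
        if "NOT_FOUND" in (result.stderr or ""):
            return False  # nothing existed to delete; safe to ignore
        result.check_returncode()  # any other error is a real failure
    return True
```

This keeps cleanup idempotent: re-running teardown after a failed provision step cannot itself fail the run.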
2019-07-10 00:18:41,911 f5112e8b MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 984, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 833, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 636, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-10 00:18:41,911 f5112e8b MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-10 00:18:41,911 f5112e8b MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-10 00:18:41,912 f5112e8b MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/f5112e8b/pkb.log>
2019-07-10 00:18:41,912 f5112e8b MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/f5112e8b/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3381

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3381/display/redirect?page=changes>

Changes:

[ttanay100] unskip ReifyTest.test_window

[ttanay100] [BEAM-7437] Add streaming flag to BQ streaming inserts IT test

[ttanay100] Change default timeout to 5 mins

[iemejia] [BEAM-6740] Add PTransformTranslator for Combine.Globally

[iemejia] [BEAM-6740] Add extractAcummulatorCoder for Combine.Globally and fix

[iemejia] [BEAM-7640] Change tests to use PayloadTranslator instead of unused

[iemejia] [BEAM-6740] Refactor to remove duplicated code in CombineTranslation

[kcweaver] [BEAM-7708] don't expect SQL shell bundled dependencies to be shadowed

[33895511+aromanenko-dev] [BEAM-6480] Adds AvroIO sink for generic records. (#9005)

[github] [SQL][Doc] fix broken gradle command.

[lcwik] Added new example on how to create a custom unbounded streaming source

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 66bd44f4ae41cd06e6b3d1cd0a05e12ec7d4b6fa (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 66bd44f4ae41cd06e6b3d1cd0a05e12ec7d4b6fa
Commit message: "Added new example on how to create a custom unbounded streaming source"
 > git rev-list --no-walk c2343c8f4feb96ca4a28b7e2bcfc872858b5b850 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5972783752889582813.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4116058693180477285.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins622569200072564861.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5547952523711923150.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3046612878562746548.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4267723206282392273.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2049377426736626065.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-09 18:26:31,855 ad3d6652 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/ad3d6652/pkb.log>
2019-07-09 18:26:31,855 ad3d6652 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-09 18:26:31,856 ad3d6652 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-09 18:26:32,170 ad3d6652 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-09 18:26:32,192 ad3d6652 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-09 18:26:32,213 ad3d6652 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-09 18:26:32,215 ad3d6652 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
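[Editor's note] The KeyError above comes from a direct dictionary index on the configured disk type in gcp_dpb_dataproc.py: when the benchmark config requests no disk ('nodisk'), that value is missing from the provider's disk-type mapping. A minimal sketch of the failure mode and a defensive variant follows; DISK_TYPE_MAP and resolve_disk_type are illustrative names, not PerfKitBenchmarker's actual identifiers.

```python
# Illustrative mapping; the real map lives in gcp_dpb_dataproc.py.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Direct indexing, DISK_TYPE_MAP[disk_type], raises KeyError: 'nodisk'
    # for unmapped values. Catching it surfaces a clearer error instead.
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported disk type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))
```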
2019-07-09 18:26:32,216 ad3d6652 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-ad3d6652 --format json --quiet --project apache-beam-testing
2019-07-09 18:26:32,806 ad3d6652 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-ad3d6652 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-ad3d6652

2019-07-09 18:26:32,807 ad3d6652 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-ad3d6652 --format json --quiet --project apache-beam-testing
2019-07-09 18:26:33,315 ad3d6652 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-ad3d6652 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-ad3d6652

2019-07-09 18:26:33,318 ad3d6652 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-09 18:26:33,318 ad3d6652 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-09 18:26:33,319 ad3d6652 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-09 18:26:33,319 ad3d6652 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/ad3d6652/pkb.log>
2019-07-09 18:26:33,319 ad3d6652 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/ad3d6652/completion_statuses.json>
Build step 'Execute shell' marked build as failure
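[Editor's note] The teardown step in this run tries to delete a Dataproc cluster that was never created, so `gcloud dataproc clusters delete` exits 1 with NOT_FOUND. A cleanup path that treats NOT_FOUND as benign keeps the original provisioning error as the visible failure. A hedged sketch, with an illustrative function name and the stderr text mirroring the log above:

```python
def is_tolerable_delete_failure(returncode, stderr):
    """Return True when a failed cluster delete can be ignored during
    cleanup: NOT_FOUND is expected if provisioning failed before the
    cluster existed. Any other non-zero exit is a real error."""
    return returncode != 0 and 'NOT_FOUND' in stderr

# stderr copied from the run above:
cleanup_stderr = (
    'ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: '
    'Cluster projects/apache-beam-testing/regions/global/clusters/pkb-ad3d6652')
```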

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3380

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3380/display/redirect?page=changes>

Changes:

[ryan] Consider Elasticsearch as one word in camelCase.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-4 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c2343c8f4feb96ca4a28b7e2bcfc872858b5b850 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c2343c8f4feb96ca4a28b7e2bcfc872858b5b850
Commit message: "Merge pull request #9008 from RyanSkraba/BEAM-7698-rename-elasticsearch"
 > git rev-list --no-walk 69bb363334fdfe7f812bbb2a0bc210419be0409d # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2685656202110489861.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6610504196107682525.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8687551365746093595.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4895403473170636010.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4227041339591143214.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7511809046355629551.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2562912325688912434.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-09 12:24:38,196 54d450a2 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/54d450a2/pkb.log>
2019-07-09 12:24:38,197 54d450a2 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-09 12:24:38,198 54d450a2 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-09 12:24:38,482 54d450a2 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-09 12:24:38,505 54d450a2 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-09 12:24:38,528 54d450a2 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-09 12:24:38,531 54d450a2 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-09 12:24:38,532 54d450a2 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-54d450a2 --format json --quiet --project apache-beam-testing
2019-07-09 12:24:40,056 54d450a2 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-54d450a2 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-54d450a2

2019-07-09 12:24:40,057 54d450a2 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-54d450a2 --format json --quiet --project apache-beam-testing
2019-07-09 12:24:40,761 54d450a2 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-54d450a2 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-54d450a2

2019-07-09 12:24:40,764 54d450a2 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-09 12:24:40,764 54d450a2 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-09 12:24:40,765 54d450a2 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-09 12:24:40,765 54d450a2 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/54d450a2/pkb.log>
2019-07-09 12:24:40,765 54d450a2 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/54d450a2/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3379

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3379/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 69bb363334fdfe7f812bbb2a0bc210419be0409d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 69bb363334fdfe7f812bbb2a0bc210419be0409d
Commit message: "Merge pull request #8999 from boyuanzz/doc_fix"
 > git rev-list --no-walk 69bb363334fdfe7f812bbb2a0bc210419be0409d # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8602350353211702953.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7612736813652430268.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4103333212627878601.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2324829405243224916.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6750588883769396729.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6993618217024662326.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6773926478237170002.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-09 06:18:50,908 59fb1f80 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/59fb1f80/pkb.log>
2019-07-09 06:18:50,909 59fb1f80 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-09 06:18:50,911 59fb1f80 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-09 06:18:51,428 59fb1f80 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-09 06:18:51,460 59fb1f80 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-09 06:18:51,495 59fb1f80 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-09 06:18:51,504 59fb1f80 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
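The root cause in the traceback above is a plain dictionary lookup in `gcp_dpb_dataproc.py`: the benchmark spec's disk type (`nodisk`) is used as a key into a disk-type mapping that has no such entry, so provisioning dies with an opaque `KeyError` before any cluster exists. A minimal sketch of the failure mode and a defensive alternative — the mapping contents and function names here are illustrative, not PerfKitBenchmarker's actual table:

```python
# Illustrative mapping: the real table in gcp_dpb_dataproc.py translates PKB
# disk types to Dataproc disk types, and evidently has no 'nodisk' entry.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Direct indexing reproduces the KeyError seen in the traceback.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_checked(disk_type):
    # Defensive variant: fail with an actionable message naming the bad
    # value and the supported ones, instead of a bare KeyError deep
    # inside the provisioning phase.
    if disk_type not in DISK_TYPE_MAP:
        raise ValueError(
            'Unsupported disk type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))
    return DISK_TYPE_MAP[disk_type]
```

Validating the disk type up front (at config-parse time rather than inside `_Create`) would surface the misconfiguration before the benchmark attempts to provision anything.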
2019-07-09 06:18:51,510 59fb1f80 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-59fb1f80 --format json --quiet --project apache-beam-testing
2019-07-09 06:18:53,618 59fb1f80 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-59fb1f80 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-59fb1f80

2019-07-09 06:18:53,619 59fb1f80 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-59fb1f80 --format json --quiet --project apache-beam-testing
2019-07-09 06:18:54,636 59fb1f80 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-59fb1f80 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-59fb1f80
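The two gcloud failures above (`clusters delete` and `clusters describe` both returning NOT_FOUND) are expected here: provisioning raised before the Dataproc cluster was ever created, so cleanup is tearing down a resource that does not exist. A teardown routine can treat that case as a successful, idempotent deletion rather than a secondary error; a sketch, with hypothetical helper names not taken from PKB's code:

```python
def is_not_found(stderr):
    # gcloud reports a missing cluster with a NOT_FOUND error on stderr,
    # e.g. "ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: ...".
    return 'NOT_FOUND' in stderr

def cleanup_succeeded(return_code, stderr):
    # Treat "already gone" the same as a clean delete, so a provisioning
    # failure is not compounded by a spurious cleanup failure.
    return return_code == 0 or is_not_found(stderr)
```

This is the behavior the log actually exhibits: PKB logs the nonzero return code but continues, reporting only the original provisioning error as the benchmark failure.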

2019-07-09 06:18:54,639 59fb1f80 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-09 06:18:54,640 59fb1f80 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-09 06:18:54,641 59fb1f80 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-09 06:18:54,641 59fb1f80 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/59fb1f80/pkb.log>
2019-07-09 06:18:54,641 59fb1f80 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/59fb1f80/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3378

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3378/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-5605] Update Beam Java SDK backlog to track latest changes in Beam

[boyuanz] Fix RestrictionTracker docstring

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-5 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 69bb363334fdfe7f812bbb2a0bc210419be0409d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 69bb363334fdfe7f812bbb2a0bc210419be0409d
Commit message: "Merge pull request #8999 from boyuanzz/doc_fix"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1000472315844338794.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6138741344736320784.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1371925352997267855.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4408903589182955469.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5074376128592405277.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4559363158433975691.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8691704614057096790.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-09 00:31:44,776 69484ed9 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/69484ed9/pkb.log>
2019-07-09 00:31:44,777 69484ed9 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-09 00:31:44,778 69484ed9 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-09 00:31:44,930 69484ed9 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-09 00:31:44,953 69484ed9 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-09 00:31:44,975 69484ed9 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-09 00:31:44,978 69484ed9 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
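The `KeyError: 'nodisk'` above comes from a plain dict lookup in `gcp_dpb_dataproc.py`: the benchmark's `disk_spec.disk_type` value (`'nodisk'`) has no entry in the provider's disk-type mapping, so indexing raises before any cluster is created. A minimal sketch of that failure mode with a defensive lookup; `DISK_TYPE_MAP`, `resolve_disk_type`, and the specific mappings are illustrative assumptions, not PerfKitBenchmarker's actual code.

```python
# Hypothetical reconstruction of the failing lookup. PerfKitBenchmarker
# appears to translate a disk_spec.disk_type string into a GCP disk type
# via a dict; 'nodisk' has no entry, so plain indexing raises KeyError.
DISK_TYPE_MAP = {
    'standard': 'pd-standard',   # illustrative entries only
    'remote_ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    """Map a spec disk_type to a GCP disk type, failing with a clear message.

    Wrapping the lookup turns an opaque KeyError deep in _Create() into an
    actionable error naming the bad value and the supported keys.
    """
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported disk_type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))
```

With this shape, a config that sets `disk_type: nodisk` would fail fast at provisioning with a message listing the supported types, instead of the bare `KeyError: 'nodisk'` seen in the traceback.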
2019-07-09 00:31:44,979 69484ed9 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-69484ed9 --format json --quiet --project apache-beam-testing
2019-07-09 00:31:46,421 69484ed9 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-69484ed9 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-69484ed9

2019-07-09 00:31:46,422 69484ed9 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-69484ed9 --format json --quiet --project apache-beam-testing
2019-07-09 00:31:46,971 69484ed9 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-69484ed9 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-69484ed9

2019-07-09 00:31:46,973 69484ed9 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-09 00:31:46,974 69484ed9 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-09 00:31:46,974 69484ed9 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-09 00:31:46,974 69484ed9 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/69484ed9/pkb.log>
2019-07-09 00:31:46,974 69484ed9 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/69484ed9/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3377

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3377/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1572655745540198448.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7033420544936491281.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4034071935666701485.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins513789897340774007.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1026137908094033862.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8741266191917317922.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2428793568900699562.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-08 18:19:16,931 b1c8ca3e MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/b1c8ca3e/pkb.log>
2019-07-08 18:19:16,931 b1c8ca3e MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-08 18:19:16,933 b1c8ca3e MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-08 18:19:17,254 b1c8ca3e MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-08 18:19:17,277 b1c8ca3e MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-08 18:19:17,300 b1c8ca3e MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-08 18:19:17,303 b1c8ca3e MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-08 18:19:17,304 b1c8ca3e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-b1c8ca3e --format json --quiet --project apache-beam-testing
2019-07-08 18:19:18,924 b1c8ca3e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-b1c8ca3e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-b1c8ca3e

2019-07-08 18:19:18,924 b1c8ca3e MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-b1c8ca3e --format json --quiet --project apache-beam-testing
2019-07-08 18:19:19,515 b1c8ca3e MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-b1c8ca3e --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-b1c8ca3e

2019-07-08 18:19:19,518 b1c8ca3e MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-08 18:19:19,518 b1c8ca3e MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-08 18:19:19,518 b1c8ca3e MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-08 18:19:19,519 b1c8ca3e MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/b1c8ca3e/pkb.log>
2019-07-08 18:19:19,519 b1c8ca3e MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/b1c8ca3e/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3376

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3376/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9074624883214591025.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3444048324219886798.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5751221519756274565.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins679142522489937899.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5235422842008009945.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9056655256944430340.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7926095934863104295.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-08 12:31:30,919 026694d6 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/026694d6/pkb.log>
2019-07-08 12:31:30,919 026694d6 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-08 12:31:30,921 026694d6 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-08 12:31:31,288 026694d6 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-08 12:31:31,311 026694d6 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-08 12:31:31,332 026694d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-08 12:31:31,334 026694d6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
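The traceback above ends in an unguarded dictionary lookup: the Dataproc provisioner indexes a disk-type mapping with the benchmark's `disk_spec.disk_type`, and this job's config supplies `'nodisk'`, which the mapping does not contain. A minimal sketch of the failure mode and a defensive alternative follows; `DISK_TYPE_MAP`, `resolve_disk_type`, and `resolve_disk_type_safe` are illustrative names, not the actual PerfKitBenchmarker symbols.

```python
# Hypothetical mapping standing in for the provisioner's disk-type table.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Direct indexing, as in the failing code path: raises KeyError
    # when the configured disk type is not in the table.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default=None):
    # A guarded variant: fall back to a default instead of raising,
    # so an unknown value like 'nodisk' fails the provision phase
    # gracefully (or is handled explicitly by the caller).
    return DISK_TYPE_MAP.get(disk_type, default)

try:
    resolve_disk_type('nodisk')
except KeyError as exc:
    print('KeyError:', exc)  # reproduces the error seen in the log

print(resolve_disk_type_safe('nodisk'))  # None
```

Whether a fallback or an upfront config validation error is the right fix depends on the benchmark spec; the sketch only isolates why provisioning aborts before any cluster exists, which also explains the NOT_FOUND errors from the cleanup `gcloud dataproc clusters delete` call below.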
2019-07-08 12:31:31,335 026694d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-026694d6 --format json --quiet --project apache-beam-testing
2019-07-08 12:31:32,820 026694d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-026694d6 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-026694d6

2019-07-08 12:31:32,821 026694d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-026694d6 --format json --quiet --project apache-beam-testing
2019-07-08 12:31:33,456 026694d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-026694d6 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-026694d6

2019-07-08 12:31:33,459 026694d6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-08 12:31:33,459 026694d6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-08 12:31:33,460 026694d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-08 12:31:33,460 026694d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/026694d6/pkb.log>
2019-07-08 12:31:33,460 026694d6 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/026694d6/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3375

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3375/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4767124888912282001.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2842012486127780723.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6790513617172824073.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8379664245990346776.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5676508018833163458.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins148671008877363397.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
[pip dependency-resolution output identical to the install log of build #3340 above]
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5484801030894236630.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-08 06:15:54,097 2c946748 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2c946748/pkb.log>
2019-07-08 06:15:54,097 2c946748 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-08 06:15:54,099 2c946748 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-08 06:15:54,356 2c946748 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-08 06:15:54,378 2c946748 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-08 06:15:54,401 2c946748 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-08 06:15:54,403 2c946748 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
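The KeyError above comes from indexing a disk-type map in gcp_dpb_dataproc.py with a disk_spec whose disk_type is 'nodisk', a key the Dataproc provider's map evidently lacks. A minimal Python sketch of the failing pattern and a more defensive variant (the map contents and function name here are hypothetical illustrations, not PKB's actual code):

```python
# Hypothetical reconstruction of the failing lookup. The real map's
# contents are unknown here; 'pd-standard' and 'pd-ssd' are illustrative
# Dataproc disk types.
DISK_TYPE_MAP = {
    'pd-standard': 'type/pd-standard',
    'pd-ssd': 'type/pd-ssd',
}


def resolve_disk_type(disk_type):
    # Direct indexing (DISK_TYPE_MAP[disk_type]) raises KeyError: 'nodisk',
    # as seen in the traceback; a guarded lookup surfaces a clearer error.
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported dpb disk_type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))
```

With this guard, a misconfigured benchmark fails with a message naming the bad value and the supported keys instead of a bare KeyError deep in provisioning.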
2019-07-08 06:15:54,405 2c946748 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-2c946748 --format json --quiet --project apache-beam-testing
2019-07-08 06:15:54,946 2c946748 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-2c946748 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-2c946748

2019-07-08 06:15:54,947 2c946748 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-2c946748 --format json --quiet --project apache-beam-testing
2019-07-08 06:15:55,450 2c946748 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-2c946748 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-2c946748
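The two NOT_FOUND errors above are expected in this run: provisioning raised before the cluster was ever created, yet teardown still issues delete/describe calls against pkb-2c946748. A hedged Python sketch of a cleanup helper that tolerates that case (the runner injection and return values are illustrative, not PKB's actual API):

```python
import subprocess


def delete_cluster_if_exists(name, runner=subprocess.run):
    """Run a gcloud-style delete, treating NOT_FOUND as already-deleted."""
    cmd = ['gcloud', 'dataproc', 'clusters', 'delete', name,
           '--format', 'json', '--quiet']
    result = runner(cmd, capture_output=True, text=True)
    if result.returncode == 0:
        return 'deleted'
    if 'NOT_FOUND' in result.stderr:
        # The cluster never existed (provisioning failed first), so there
        # is nothing to clean up; don't surface this as a teardown error.
        return 'absent'
    raise RuntimeError('delete failed: %s' % result.stderr)
```

Injecting `runner` keeps the helper testable without a real gcloud installation; in production it defaults to subprocess.run.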

2019-07-08 06:15:55,452 2c946748 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-08 06:15:55,453 2c946748 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-08 06:15:55,453 2c946748 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-08 06:15:55,453 2c946748 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2c946748/pkb.log>
2019-07-08 06:15:55,453 2c946748 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/2c946748/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3374

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3374/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-3 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5789324481615058751.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3881860099849515039.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8066672350581866967.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4298924355298295315.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins39698669539166817.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6363168705700927268.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6045057242959790781.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-08 00:15:25,367 0e78a3f5 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0e78a3f5/pkb.log>
2019-07-08 00:15:25,367 0e78a3f5 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-08 00:15:25,368 0e78a3f5 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-08 00:15:25,604 0e78a3f5 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-08 00:15:25,627 0e78a3f5 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-08 00:15:25,650 0e78a3f5 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-08 00:15:25,652 0e78a3f5 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-08 00:15:25,654 0e78a3f5 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-0e78a3f5 --format json --quiet --project apache-beam-testing
2019-07-08 00:15:27,075 0e78a3f5 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-0e78a3f5 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-0e78a3f5

2019-07-08 00:15:27,075 0e78a3f5 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-0e78a3f5 --format json --quiet --project apache-beam-testing
2019-07-08 00:15:27,747 0e78a3f5 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-0e78a3f5 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-0e78a3f5

2019-07-08 00:15:27,749 0e78a3f5 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-08 00:15:27,750 0e78a3f5 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-08 00:15:27,751 0e78a3f5 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-08 00:15:27,751 0e78a3f5 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0e78a3f5/pkb.log>
2019-07-08 00:15:27,751 0e78a3f5 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0e78a3f5/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3373

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3373/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8390834791543771849.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2985593007293409527.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2264059408715852560.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4408314534951727670.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1005885619519460183.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3057175916116576832.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7609658477445649195.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-07 18:15:11,037 758c0ff1 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/758c0ff1/pkb.log>
2019-07-07 18:15:11,037 758c0ff1 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-07 18:15:11,039 758c0ff1 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-07 18:15:11,235 758c0ff1 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-07 18:15:11,258 758c0ff1 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-07 18:15:11,281 758c0ff1 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-07 18:15:11,284 758c0ff1 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
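The KeyError above comes from a direct dictionary index at gcp_dpb_dataproc.py line 130: the worker group's disk_type, 'nodisk', has no entry in the provider's disk-type mapping. The sketch below is illustrative only — DISK_TYPE_MAP and resolve_disk_type are made-up names, not PerfKitBenchmarker's actual identifiers — and shows how a guarded lookup would turn the crash into a clear configuration error.

```python
# Hypothetical reconstruction of the failing lookup; the real map and
# supported values live in PerfKitBenchmarker's Dataproc provider.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Direct indexing reproduces the crash seen in the log:
    #   DISK_TYPE_MAP['nodisk']  ->  KeyError: 'nodisk'
    # A guarded lookup fails with an actionable message instead.
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported Dataproc disk type: %r (expected one of %s)'
            % (disk_type, sorted(DISK_TYPE_MAP)))
```

Under this reading, the fix is either to map 'nodisk' to a supported value in the provider or to pass a disk type the benchmark config actually supports.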
2019-07-07 18:15:11,286 758c0ff1 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-758c0ff1 --format json --quiet --project apache-beam-testing
2019-07-07 18:15:11,861 758c0ff1 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-758c0ff1 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-758c0ff1

2019-07-07 18:15:11,862 758c0ff1 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-758c0ff1 --format json --quiet --project apache-beam-testing
2019-07-07 18:15:12,394 758c0ff1 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-758c0ff1 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-758c0ff1

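The delete and describe commands above return NOT_FOUND because provisioning failed before the cluster was ever created, so the non-zero exit codes during teardown are harmless. A minimal sketch of that teardown logic, assuming a hypothetical helper name (delete_succeeded is not a real PerfKitBenchmarker function), treats NOT_FOUND on stderr as an already-clean state:

```python
def delete_succeeded(returncode, stderr):
    """Classify a `gcloud dataproc clusters delete` result.

    gcloud exits non-zero with NOT_FOUND on stderr when the cluster
    never existed, as in the log above; for cleanup that counts as
    success, while any other failure should be surfaced.
    """
    if returncode == 0:
        return True
    return 'NOT_FOUND' in stderr
```

Any error other than NOT_FOUND (e.g. a permission failure) would still be reported, so real leaks are not silently ignored.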
2019-07-07 18:15:12,397 758c0ff1 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-07 18:15:12,397 758c0ff1 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-07 18:15:12,398 758c0ff1 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-07 18:15:12,398 758c0ff1 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/758c0ff1/pkb.log>
2019-07-07 18:15:12,398 758c0ff1 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/758c0ff1/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3372

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3372/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-5 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2344623944363825012.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3881140040262043074.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7789148915004416319.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1709627145800758157.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2631168679363539705.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2955198412399958468.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5160812355523420398.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-07 12:17:53,275 219f672d MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/219f672d/pkb.log>
2019-07-07 12:17:53,276 219f672d MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-07 12:17:53,277 219f672d MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
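The --config_override flag in the invocation above sets a nested key in the benchmark config via a dotted path ("dpb_wordcount_benchmark.dpb_service.service_type=dataproc"). A generic sketch of how such a dotted-path override can be applied to a nested dict -- this is an illustration, not PerfKitBenchmarker's actual implementation:

```python
def apply_override(config, override):
    # Parse 'a.b.c=value' and set config['a']['b']['c'] = value,
    # creating intermediate dicts as needed. Values stay strings.
    path, _, value = override.partition('=')
    keys = path.split('.')
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config

cfg = apply_override(
    {}, 'dpb_wordcount_benchmark.dpb_service.service_type=dataproc')
# cfg == {'dpb_wordcount_benchmark':
#             {'dpb_service': {'service_type': 'dataproc'}}}
```

Note that such an override replaces only the keys it names; any keys the default config expects but the override path does not touch (such as a worker disk spec) are left as-is, which is relevant to the failure below.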
2019-07-07 12:17:53,480 219f672d MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-07 12:17:53,502 219f672d MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-07 12:17:53,522 219f672d MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-07 12:17:53,524 219f672d MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
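The KeyError: 'nodisk' above is the classic failure mode of an unguarded dict lookup: the provisioning code indexes a mapping by the worker group's disk_type, and the mapping has no entry for the sentinel value 'nodisk'. A minimal sketch of the pattern and a defensive variant -- the names and mapping here are hypothetical, not the actual gcp_dpb_dataproc.py code:

```python
# Hypothetical mapping of PKB disk-type names to provider disk types.
# It has no entry for the sentinel 'nodisk' used when the benchmark
# config defines no disk for the worker group.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Unguarded lookup: raises KeyError('nodisk') when no disk is
    # configured, which is exactly the traceback seen above.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default='pd-standard'):
    # Defensive variant: fall back to a default instead of raising.
    return DISK_TYPE_MAP.get(disk_type, default)
```

Whether falling back to a default is the right fix (versus requiring an explicit disk spec in the benchmark config) is a design choice for the benchmark owner; the sketch only shows where the exception comes from.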
2019-07-07 12:17:53,525 219f672d MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-219f672d --format json --quiet --project apache-beam-testing
2019-07-07 12:17:55,074 219f672d MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-219f672d --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-219f672d

2019-07-07 12:17:55,075 219f672d MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-219f672d --format json --quiet --project apache-beam-testing
2019-07-07 12:17:55,595 219f672d MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-219f672d --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-219f672d
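The two NOT_FOUND errors above are harmless but noisy: provisioning failed before the cluster was created, yet teardown still issues delete and describe calls against it. A sketch of an existence check that would keep cleanup quiet -- the helper names are hypothetical, and the gcloud runner is injectable so the logic can be exercised without a real project:

```python
import subprocess

def cluster_exists(name, project, runner=subprocess.run):
    # A nonzero exit from `gcloud dataproc clusters describe`
    # (e.g. NOT_FOUND) is treated as "cluster does not exist".
    result = runner(
        ['gcloud', 'dataproc', 'clusters', 'describe', name,
         '--format', 'json', '--quiet', '--project', project],
        capture_output=True, text=True)
    return result.returncode == 0

def delete_cluster_if_exists(name, project, runner=subprocess.run):
    # Skip the delete when provisioning never created the cluster,
    # avoiding NOT_FOUND errors like the ones logged above.
    if not cluster_exists(name, project, runner=runner):
        return False
    runner(['gcloud', 'dataproc', 'clusters', 'delete', name,
            '--format', 'json', '--quiet', '--project', project],
           capture_output=True, text=True)
    return True
```

PKB's actual Resource lifecycle may deliberately attempt deletion unconditionally so that partially created resources are never leaked; this sketch only shows how the log noise could be suppressed.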

2019-07-07 12:17:55,598 219f672d MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-07 12:17:55,599 219f672d MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-07 12:17:55,599 219f672d MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-07 12:17:55,599 219f672d MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/219f672d/pkb.log>
2019-07-07 12:17:55,600 219f672d MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/219f672d/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3371

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3371/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins929398838033218846.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2306891266525535600.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4614968042334249963.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins291913913031788156.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6302710266318114599.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8922334542522903934.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4821368797815486504.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-07 06:15:55,997 b93caf37 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/b93caf37/pkb.log>
2019-07-07 06:15:55,998 b93caf37 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-07 06:15:55,999 b93caf37 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-07 06:15:56,250 b93caf37 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-07 06:15:56,272 b93caf37 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-07 06:15:56,295 b93caf37 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-07 06:15:56,298 b93caf37 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-07 06:15:56,299 b93caf37 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-b93caf37 --format json --quiet --project apache-beam-testing
2019-07-07 06:15:56,902 b93caf37 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-b93caf37 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-b93caf37

2019-07-07 06:15:56,904 b93caf37 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-b93caf37 --format json --quiet --project apache-beam-testing
2019-07-07 06:15:57,466 b93caf37 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-b93caf37 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-b93caf37

2019-07-07 06:15:57,468 b93caf37 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-07 06:15:57,469 b93caf37 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-07 06:15:57,469 b93caf37 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-07 06:15:57,469 b93caf37 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/b93caf37/pkb.log>
2019-07-07 06:15:57,469 b93caf37 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/b93caf37/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3370

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3370/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6030213752250835630.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2465970171000648982.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8960181615894871555.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1705493769924088514.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1611702465328693640.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins272733296007698356.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8457624480169486392.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-07 00:15:46,865 205b66b2 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/205b66b2/pkb.log>
2019-07-07 00:15:46,866 205b66b2 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-07 00:15:46,867 205b66b2 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
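The `--config_override` flag above sets one value at a dotted path inside the benchmark's nested config (here, switching `dpb_service.service_type` to `dataproc`). A minimal sketch of how such a dotted-path override can be applied to a nested dict; this is an illustrative stand-in, not pkb.py's actual parser:

```python
def apply_config_override(config, override):
    """Apply a 'a.b.c=value' style override to a nested config dict.

    Splits the left-hand side on dots, walks/creates intermediate
    dicts, and assigns the value at the leaf key.
    """
    path, _, value = override.partition('=')
    keys = path.split('.')
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config
```

Applied to the flag value in this run, it would produce `{'dpb_wordcount_benchmark': {'dpb_service': {'service_type': 'dataproc'}}}`.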
2019-07-07 00:15:47,104 205b66b2 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-07 00:15:47,127 205b66b2 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-07 00:15:47,151 205b66b2 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-07 00:15:47,154 205b66b2 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
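The traceback above ends in a plain dict lookup: the Dataproc provider indexes a disk-type mapping by `disk_spec.disk_type`, and the value `'nodisk'` has no entry, so provisioning dies with an opaque `KeyError`. A minimal sketch of the failing pattern and a defensive alternative; the names and mapping contents here are hypothetical, not PerfKitBenchmarker's actual code:

```python
# Hypothetical stand-in for the disk-type table in gcp_dpb_dataproc.py.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type_strict(disk_type):
    # Mirrors the failing code path: direct indexing raises KeyError
    # for an unrecognized value such as 'nodisk'.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default=None):
    # Defensive variant: fall back to a default, or fail with a
    # message that names the bad value instead of a bare KeyError.
    if disk_type in DISK_TYPE_MAP:
        return DISK_TYPE_MAP[disk_type]
    if default is not None:
        return default
    raise ValueError('unsupported disk_type: %r' % disk_type)
```

With the safe variant, a benchmark config carrying `'nodisk'` would either get a documented default or a clear validation error at provision time.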
2019-07-07 00:15:47,155 205b66b2 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-205b66b2 --format json --quiet --project apache-beam-testing
2019-07-07 00:15:47,843 205b66b2 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-205b66b2 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-205b66b2

2019-07-07 00:15:47,844 205b66b2 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-205b66b2 --format json --quiet --project apache-beam-testing
2019-07-07 00:15:48,424 205b66b2 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-205b66b2 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-205b66b2

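Teardown runs even though provisioning never created the cluster, so both the `delete` and `describe` calls above return NOT_FOUND with exit code 1. A sketch of cleanup logic that treats NOT_FOUND as "already gone" rather than a failure; the command-runner callable is a stand-in so the logic can be exercised without gcloud, not pkb's actual `vm_util` API:

```python
def delete_cluster(run_command, cluster_name, project):
    """Delete a Dataproc cluster, treating NOT_FOUND as success.

    run_command is a callable returning (stdout, stderr, returncode)
    for a command given as a list of arguments.
    """
    cmd = ['gcloud', 'dataproc', 'clusters', 'delete', cluster_name,
           '--format', 'json', '--quiet', '--project', project]
    stdout, stderr, retcode = run_command(cmd)
    if retcode == 0:
        return True   # cluster deleted
    if 'NOT_FOUND' in stderr:
        return True   # cluster was never created or is already gone
    return False      # genuine deletion failure

# Fake runner reproducing the log above: the cluster does not exist.
def not_found_runner(cmd):
    return ('', 'ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: '
                'Not found: Cluster pkb-205b66b2', 1)
```

Under this policy, the cleanup phase in the run above would log the missing cluster and continue instead of surfacing the NOT_FOUND as an error.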
2019-07-07 00:15:48,426 205b66b2 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-07 00:15:48,427 205b66b2 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-07 00:15:48,427 205b66b2 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-07 00:15:48,428 205b66b2 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/205b66b2/pkb.log>
2019-07-07 00:15:48,428 205b66b2 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/205b66b2/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3369

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3369/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2115054053855925705.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6884022508325250336.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2338589089678190289.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2427235839031031750.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8424533229835993421.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5429193174190917933.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8158838618852888274.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-06 18:20:23,265 a782d252 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a782d252/pkb.log>
2019-07-06 18:20:23,266 a782d252 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-06 18:20:23,268 a782d252 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
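The `--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc` flag in the list above uses a dotted path to set one key deep inside the benchmark's nested config. A minimal sketch of how such an override can be applied to a nested dict (this is an illustrative implementation, not PerfKitBenchmarker's actual parser):

```python
def apply_override(config, override):
    """Apply a single 'a.b.c=value' override to a nested dict in place.

    Intermediate dicts are created as needed, mirroring how a dotted
    config path addresses one leaf of the benchmark configuration.
    """
    path, _, value = override.partition('=')
    keys = path.split('.')
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config


cfg = apply_override(
    {}, 'dpb_wordcount_benchmark.dpb_service.service_type=dataproc')
# cfg is now {'dpb_wordcount_benchmark':
#                 {'dpb_service': {'service_type': 'dataproc'}}}
```

This is why the later warning about the key "flags" not being in the default config can fire: an override path that does not match the expected schema is still applied mechanically, and only flagged as a possible typo.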
2019-07-06 18:20:23,881 a782d252 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-06 18:20:23,940 a782d252 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-06 18:20:23,992 a782d252 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-06 18:20:23,996 a782d252 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-06 18:20:23,998 a782d252 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-a782d252 --format json --quiet --project apache-beam-testing
2019-07-06 18:20:26,789 a782d252 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-a782d252 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-a782d252

2019-07-06 18:20:26,790 a782d252 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-a782d252 --format json --quiet --project apache-beam-testing
2019-07-06 18:20:28,573 a782d252 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-a782d252 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-a782d252

2019-07-06 18:20:28,576 a782d252 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-06 18:20:28,577 a782d252 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-06 18:20:28,577 a782d252 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-06 18:20:28,577 a782d252 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a782d252/pkb.log>
2019-07-06 18:20:28,578 a782d252 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a782d252/completion_statuses.json>
Build step 'Execute shell' marked build as failure
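The provisioning failure logged above is a plain dictionary lookup miss: in `gcp_dpb_dataproc.py`, `_Create` indexes a disk-type mapping by `self.spec.worker_group.disk_spec.disk_type`, and the value `'nodisk'` has no entry, so Python raises `KeyError: 'nodisk'`. A minimal sketch of the pattern and a defensive alternative (the mapping contents and function name here are hypothetical, not PerfKitBenchmarker's actual table):

```python
# Hypothetical stand-in for the provider's disk-type mapping; the real
# table in gcp_dpb_dataproc.py maps PKB disk types to GCE disk types.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}


def resolve_disk_type(disk_type):
    # Direct indexing reproduces the failure in the traceback:
    #   DISK_TYPE_MAP['nodisk']  ->  KeyError: 'nodisk'
    # A .get() with an explicit check turns it into a readable error.
    resolved = DISK_TYPE_MAP.get(disk_type)
    if resolved is None:
        raise ValueError(
            'Unsupported disk type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))
    return resolved
```

With this shape the benchmark would fail with a message naming the bad value and the supported ones, instead of a bare `KeyError` surfacing from deep inside the provision phase.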

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3368

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3368/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8609153729188206707.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8351008255821376776.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2751465873616526376.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins81106231684272897.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins764673763352199010.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3262511241361620200.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1773581976886225705.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-06 12:18:17,361 4dbac8fe MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/4dbac8fe/pkb.log>
2019-07-06 12:18:17,362 4dbac8fe MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-06 12:18:17,363 4dbac8fe MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-06 12:18:17,548 4dbac8fe MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-06 12:18:17,571 4dbac8fe MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-06 12:18:17,593 4dbac8fe MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-06 12:18:17,595 4dbac8fe MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-06 12:18:17,596 4dbac8fe MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-4dbac8fe --format json --quiet --project apache-beam-testing
2019-07-06 12:18:18,169 4dbac8fe MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-4dbac8fe --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-4dbac8fe

2019-07-06 12:18:18,170 4dbac8fe MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-4dbac8fe --format json --quiet --project apache-beam-testing
2019-07-06 12:18:18,685 4dbac8fe MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-4dbac8fe --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-4dbac8fe

2019-07-06 12:18:18,688 4dbac8fe MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-06 12:18:18,688 4dbac8fe MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-06 12:18:18,688 4dbac8fe MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-06 12:18:18,689 4dbac8fe MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/4dbac8fe/pkb.log>
2019-07-06 12:18:18,689 4dbac8fe MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/4dbac8fe/completion_statuses.json>
Build step 'Execute shell' marked build as failure
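After the provisioning failure, the teardown phase above runs `gcloud dataproc clusters delete` and `describe` against a cluster that was never created, so both commands exit with ReturnCode:1 and a NOT_FOUND error; PKB logs this and continues. A cleanup check that treats NOT_FOUND as a successful, idempotent delete can be sketched as follows (the helper and its signature are hypothetical, not PKB's actual code):

```python
def cleanup_succeeded(returncode, stderr):
    """Treat a resource delete as successful if the command worked, or
    if the resource was already gone (gcloud reports NOT_FOUND on
    stderr), so teardown after a failed provision stays quiet."""
    if returncode == 0:
        return True
    return 'NOT_FOUND' in stderr
```

For the log above, `cleanup_succeeded(1, 'ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster ...')` would report the teardown as clean rather than surfacing a second error.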



Build failed in Jenkins: beam_PerformanceTests_Spark #3367

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3367/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7971991099341953669.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5057814181831934083.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2030914960679900311.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8353986790543555202.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5988325045783628233.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1466540325054713581.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1897432888231672101.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-06 06:15:52,737 36b7694b MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/36b7694b/pkb.log>
2019-07-06 06:15:52,737 36b7694b MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-06 06:15:52,738 36b7694b MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
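The `--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc` flag above sets one nested key in the benchmark config via a dotted path. A minimal sketch of how such a "dotted.path=value" override can be applied to a nested dict (an illustrative reimplementation, not PKB's actual code; the function name is invented):

```python
def apply_config_override(config, override):
    # Split "a.b.c=value" into the key path ["a", "b", "c"] and the value,
    # then walk/create nested dicts and set the leaf key.
    path, _, value = override.partition('=')
    keys = path.split('.')
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config

cfg = apply_config_override(
    {}, 'dpb_wordcount_benchmark.dpb_service.service_type=dataproc')
```

This is why the log's later WARNING about an unexpected key can fire: the override path is merged into the user config verbatim, and any segment not present in the default config looks like a possible typo.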
2019-07-06 06:15:52,996 36b7694b MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-06 06:15:53,019 36b7694b MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-06 06:15:53,041 36b7694b MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-06 06:15:53,043 36b7694b MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
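The traceback ends in a plain dictionary lookup: `gcp_dpb_dataproc.py` indexes a mapping with the worker group's `disk_spec.disk_type`, and the mapping has no entry for `'nodisk'`. A minimal sketch of the failure mode and a guarded alternative (mapping contents and function names here are illustrative assumptions, not PKB's actual table):

```python
# Hypothetical disk_type -> Dataproc disk argument mapping with no
# entry for 'nodisk'; contents are assumed for illustration only.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Direct indexing reproduces the KeyError: 'nodisk' seen in the log.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_guarded(disk_type, default=None):
    # Defensive variant: return a fallback instead of aborting the
    # provisioning phase when an unmapped disk type is configured.
    return DISK_TYPE_MAP.get(disk_type, default)
```

Whether falling back or failing fast is correct depends on whether a diskless worker group is a supported configuration; the guarded form merely shows where the crash could be intercepted.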
2019-07-06 06:15:53,044 36b7694b MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-36b7694b --format json --quiet --project apache-beam-testing
2019-07-06 06:15:53,621 36b7694b MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-36b7694b --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-36b7694b

2019-07-06 06:15:53,622 36b7694b MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-36b7694b --format json --quiet --project apache-beam-testing
2019-07-06 06:15:54,133 36b7694b MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-36b7694b --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-36b7694b
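After the provisioning error, PKB's teardown runs `gcloud dataproc clusters delete` and `describe`, and both return NOT_FOUND because the cluster was never created; the run continues rather than failing on cleanup. A hedged sketch of that decision logic (the function name is invented for illustration):

```python
def cluster_already_gone(returncode, stderr):
    # A nonzero gcloud exit whose stderr reports NOT_FOUND means the
    # cluster never existed (or is already deleted), so teardown can
    # proceed instead of treating the delete failure as fatal.
    return returncode != 0 and 'NOT_FOUND' in stderr

# The exact stderr from the delete attempt above:
stderr = ('ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: '
          'Cluster projects/apache-beam-testing/regions/global/clusters/'
          'pkb-36b7694b')
```

Distinguishing "already gone" from other nonzero exits (permissions, quota) is what lets cleanup stay idempotent.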

2019-07-06 06:15:54,136 36b7694b MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-06 06:15:54,136 36b7694b MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-06 06:15:54,136 36b7694b MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-06 06:15:54,136 36b7694b MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/36b7694b/pkb.log>
2019-07-06 06:15:54,136 36b7694b MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/36b7694b/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3366

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3366/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8891126111870215307.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7058014445850070305.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5055348952126490194.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5274803347257474564.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1472087819157264269.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5377905465331031208.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1708877458100796816.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-06 00:17:11,650 0b184d8d MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0b184d8d/pkb.log>
2019-07-06 00:17:11,651 0b184d8d MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-06 00:17:11,653 0b184d8d MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-06 00:17:11,918 0b184d8d MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-06 00:17:11,941 0b184d8d MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-06 00:17:11,966 0b184d8d MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-06 00:17:11,969 0b184d8d MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
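[Editor's note: the `KeyError: 'nodisk'` above comes from indexing a dict of disk-type mappings with a disk type it does not contain. A minimal sketch of that failure pattern, with a guarded lookup that fails more descriptively — the dict name and values here are hypothetical illustrations, not PerfKitBenchmarker's actual code:]

```python
# Hypothetical mapping keyed by disk type; 'nodisk' is absent,
# mirroring the KeyError in the traceback above.
DISK_TYPE_MAP = {"pd-standard": "standard", "pd-ssd": "ssd"}

def resolve_disk_type(disk_type):
    # Bare indexing (DISK_TYPE_MAP[disk_type]) raises a terse KeyError
    # for unknown types; a guarded lookup reports what was expected.
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            "Unsupported disk type %r; expected one of %s"
            % (disk_type, sorted(DISK_TYPE_MAP)))
```
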
2019-07-06 00:17:11,970 0b184d8d MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-0b184d8d --format json --quiet --project apache-beam-testing
2019-07-06 00:17:12,572 0b184d8d MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-0b184d8d --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-0b184d8d

2019-07-06 00:17:12,573 0b184d8d MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-0b184d8d --format json --quiet --project apache-beam-testing
2019-07-06 00:17:13,339 0b184d8d MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-0b184d8d --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-0b184d8d

2019-07-06 00:17:13,342 0b184d8d MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-06 00:17:13,343 0b184d8d MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-06 00:17:13,343 0b184d8d MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-06 00:17:13,343 0b184d8d MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0b184d8d/pkb.log>
2019-07-06 00:17:13,344 0b184d8d MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0b184d8d/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3365

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3365/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5389048184966245682.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4616099428672746698.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4673257486342201886.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8531210336755444536.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins617248644765090898.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5693782505389984182.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9005004905732186790.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-05 18:18:54,354 a0cf1b7c MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a0cf1b7c/pkb.log>
2019-07-05 18:18:54,355 a0cf1b7c MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-05 18:18:54,356 a0cf1b7c MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-05 18:18:54,541 a0cf1b7c MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-05 18:18:54,564 a0cf1b7c MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-05 18:18:54,586 a0cf1b7c MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-05 18:18:54,588 a0cf1b7c MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-05 18:18:54,589 a0cf1b7c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-a0cf1b7c --format json --quiet --project apache-beam-testing
2019-07-05 18:18:55,132 a0cf1b7c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-a0cf1b7c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-a0cf1b7c

2019-07-05 18:18:55,133 a0cf1b7c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-a0cf1b7c --format json --quiet --project apache-beam-testing
2019-07-05 18:18:55,659 a0cf1b7c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-a0cf1b7c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-a0cf1b7c

2019-07-05 18:18:55,662 a0cf1b7c MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-05 18:18:55,662 a0cf1b7c MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-05 18:18:55,663 a0cf1b7c MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-05 18:18:55,663 a0cf1b7c MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a0cf1b7c/pkb.log>
2019-07-05 18:18:55,663 a0cf1b7c MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/a0cf1b7c/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3364

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3364/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8260210808292510975.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5177030636991118302.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4489617675826494724.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2844473267462406546.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6853589854156151827.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5596240117813387154.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins50991129774526581.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-05 12:19:00,108 4c7d09bc MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/4c7d09bc/pkb.log>
2019-07-05 12:19:00,109 4c7d09bc MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-05 12:19:00,110 4c7d09bc MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-05 12:19:00,341 4c7d09bc MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-05 12:19:00,365 4c7d09bc MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-05 12:19:00,388 4c7d09bc MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-05 12:19:00,390 4c7d09bc MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
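The `KeyError: 'nodisk'` above is the usual failure mode of indexing a disk-type mapping directly with an unvalidated spec value: line 130 of `gcp_dpb_dataproc.py` looks up `self.spec.worker_group.disk_spec.disk_type` in a dict that has no `'nodisk'` entry. A minimal sketch of that failure mode follows; `DISK_TYPE_MAP` and `resolve_disk_type` are hypothetical names for illustration, not PKB's actual identifiers, and the mapping contents are assumed:

```python
# Hypothetical reconstruction of the lookup that fails with
# KeyError: 'nodisk'. The mapping below is illustrative only.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}


def resolve_disk_type(disk_type):
    """Map a benchmark disk_type to a provider disk type.

    A bare DISK_TYPE_MAP[disk_type] raises KeyError for unmapped
    values such as 'nodisk'; converting that into an explicit error
    yields a readable provisioning failure instead of a raw traceback.
    """
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError('unsupported disk type: %r' % disk_type)
```

With a guard like this, a `'nodisk'` spec would fail fast with a message naming the bad value rather than surfacing as a bare `KeyError` deep in `_Create()`.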
2019-07-05 12:19:00,392 4c7d09bc MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-4c7d09bc --format json --quiet --project apache-beam-testing
2019-07-05 12:19:02,092 4c7d09bc MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-4c7d09bc --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-4c7d09bc

2019-07-05 12:19:02,093 4c7d09bc MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-4c7d09bc --format json --quiet --project apache-beam-testing
2019-07-05 12:19:02,719 4c7d09bc MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-4c7d09bc --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-4c7d09bc
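After the provisioning failure, PKB tears down by running `gcloud dataproc clusters delete` and then `clusters describe`; both return `NOT_FOUND` (ReturnCode:1) because the cluster was never created, and the run continues, treating that as a clean state. A sketch of that tolerant-teardown check; the helper name `teardown_ok` is hypothetical, not PKB's actual code:

```python
def teardown_ok(returncode, stderr):
    """Decide whether a 'gcloud dataproc clusters delete' run left a
    clean state. Exit code 0 means the cluster was deleted; NOT_FOUND
    on a nonzero exit means there was nothing to delete, which is
    equally fine after a failed provision. Any other error is real.
    """
    return returncode == 0 or 'NOT_FOUND' in stderr
```

This is why the two NOT_FOUND errors above are logged at INFO rather than aborting cleanup: a cluster that never existed needs no deletion.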

2019-07-05 12:19:02,722 4c7d09bc MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-05 12:19:02,722 4c7d09bc MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-05 12:19:02,722 4c7d09bc MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-05 12:19:02,723 4c7d09bc MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/4c7d09bc/pkb.log>
2019-07-05 12:19:02,723 4c7d09bc MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/4c7d09bc/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3363

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3363/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7764408002913506085.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8665943265164031471.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5211933807397349723.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4940683504647056364.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5134987354322756080.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8931627492286583469.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6000218178112738753.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-05 06:17:31,704 ca6a0e82 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/ca6a0e82/pkb.log>
2019-07-05 06:17:31,704 ca6a0e82 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-05 06:17:31,706 ca6a0e82 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-05 06:17:31,903 ca6a0e82 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-05 06:17:31,925 ca6a0e82 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-05 06:17:31,946 ca6a0e82 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-05 06:17:31,949 ca6a0e82 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
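The KeyError above is raised when `_Create` indexes a disk-type mapping with a `disk_type` value (`'nodisk'`) that has no entry in the dict. A minimal sketch of that failure mode and a guarded lookup follows; the mapping contents and function name here are illustrative only, not PerfKitBenchmarker's actual code:

```python
# Hypothetical reconstruction of the lookup that fails with KeyError: 'nodisk'.
# The real mapping lives in gcp_dpb_dataproc.py; entries below are assumptions.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Direct indexing, DISK_TYPE_MAP[disk_type], raises a bare KeyError for
    # unmapped values such as 'nodisk'. A guarded lookup surfaces a clearer
    # error naming the offending value and the supported options.
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported disk type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))
```

With a guard like this, the benchmark would fail with an explicit message about the unsupported disk type instead of the opaque `KeyError: 'nodisk'` seen in the traceback.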
2019-07-05 06:17:31,950 ca6a0e82 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-ca6a0e82 --format json --quiet --project apache-beam-testing
2019-07-05 06:17:32,493 ca6a0e82 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-ca6a0e82 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-ca6a0e82

2019-07-05 06:17:32,494 ca6a0e82 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-ca6a0e82 --format json --quiet --project apache-beam-testing
2019-07-05 06:17:33,016 ca6a0e82 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-ca6a0e82 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-ca6a0e82

2019-07-05 06:17:33,019 ca6a0e82 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-05 06:17:33,019 ca6a0e82 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-05 06:17:33,020 ca6a0e82 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-05 06:17:33,020 ca6a0e82 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/ca6a0e82/pkb.log>
2019-07-05 06:17:33,020 ca6a0e82 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/ca6a0e82/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3362

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3362/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5820386917874271928.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8957234860284485154.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3388625017645766661.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3325407793479119539.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4556322666533503940.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8316605646688541728.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2478346006275625578.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-05 00:15:37,171 edb7c6f6 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/edb7c6f6/pkb.log>
2019-07-05 00:15:37,173 edb7c6f6 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-05 00:15:37,174 edb7c6f6 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-05 00:15:37,420 edb7c6f6 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-05 00:15:37,442 edb7c6f6 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-05 00:15:37,465 edb7c6f6 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-05 00:15:37,468 edb7c6f6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-05 00:15:37,469 edb7c6f6 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-edb7c6f6 --format json --quiet --project apache-beam-testing
2019-07-05 00:15:38,019 edb7c6f6 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-edb7c6f6 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-edb7c6f6

2019-07-05 00:15:38,019 edb7c6f6 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-edb7c6f6 --format json --quiet --project apache-beam-testing
2019-07-05 00:15:38,498 edb7c6f6 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-edb7c6f6 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-edb7c6f6

2019-07-05 00:15:38,500 edb7c6f6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-05 00:15:38,500 edb7c6f6 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-05 00:15:38,501 edb7c6f6 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-05 00:15:38,501 edb7c6f6 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/edb7c6f6/pkb.log>
2019-07-05 00:15:38,501 edb7c6f6 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/edb7c6f6/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3361

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3361/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1829565837598343962.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4795694736657771121.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1926483106826525336.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4825254443460632950.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2989515872484285247.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5894278990412353134.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins914761921726567923.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-04 18:16:04,426 d695b4b4 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d695b4b4/pkb.log>
2019-07-04 18:16:04,427 d695b4b4 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-04 18:16:04,428 d695b4b4 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-04 18:16:04,646 d695b4b4 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-04 18:16:04,673 d695b4b4 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-04 18:16:04,695 d695b4b4 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-04 18:16:04,697 d695b4b4 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
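For readers triaging this failure: the traceback ends in an unguarded dictionary lookup keyed by the worker group's `disk_spec.disk_type` (line 130 of `gcp_dpb_dataproc.py`), which crashes provisioning when the configured value ('nodisk') is absent from the map. A minimal sketch of the failure mode and a defensive alternative; the mapping name `DISK_TYPE_MAP` and the values in it are hypothetical, since the actual PerfKitBenchmarker table is not shown in this log:

```python
# Hypothetical stand-in for the disk-type table indexed in gcp_dpb_dataproc.py.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Unguarded lookup, as in the traceback above: an unrecognized value
    # such as 'nodisk' raises KeyError and aborts the provision phase.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default='pd-standard'):
    # Defensive variant: fall back to a default rather than crashing.
    return DISK_TYPE_MAP.get(disk_type, default)

try:
    resolve_disk_type('nodisk')
except KeyError as exc:
    print('KeyError:', exc)  # reproduces the error seen in the log

print(resolve_disk_type_safe('nodisk'))  # falls back to 'pd-standard'
```

Whether the right fix is a fallback or an explicit validation error with a clear message is a design choice for the PerfKitBenchmarker maintainers; this only illustrates why the benchmark fails before any cluster exists (hence the NOT_FOUND errors from the cleanup `gcloud` calls below).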
2019-07-04 18:16:04,698 d695b4b4 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-d695b4b4 --format json --quiet --project apache-beam-testing
2019-07-04 18:16:05,623 d695b4b4 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-d695b4b4 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-d695b4b4

2019-07-04 18:16:05,624 d695b4b4 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-d695b4b4 --format json --quiet --project apache-beam-testing
2019-07-04 18:16:06,086 d695b4b4 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-d695b4b4 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-d695b4b4

2019-07-04 18:16:06,088 d695b4b4 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-04 18:16:06,088 d695b4b4 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-04 18:16:06,089 d695b4b4 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-04 18:16:06,089 d695b4b4 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d695b4b4/pkb.log>
2019-07-04 18:16:06,089 d695b4b4 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/d695b4b4/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3360

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3360/display/redirect?page=changes>

Changes:

[jozsi] Add Jet Runner to the Get Started page

[iemejia] [BEAM-7682] Fix Combine.GroupedValues javadoc code snippet

[cyturel] [BEAM-7683] - fix withQueryFn when split is more than 0

[kamil.wasilewski] [BEAM-7550] Reimplement Python ParDo load test according to the proposal

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2d5e493abf39ee6fc89831bb0b7ec9fee592b9c5
Commit message: "Merge pull request #8847: [BEAM-7550] Missing pipeline parameters in ParDo Load Test"
 > git rev-list --no-walk 32e2e3e910619d400568073dde0d7b36698a416a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8440275446116181894.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2932686297425543397.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins567284088825974641.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3609352261217823321.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6144149032211256167.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2042271593486512091.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3572617600087477984.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-04 12:20:20,609 82236299 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/82236299/pkb.log>
2019-07-04 12:20:20,610 82236299 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-04 12:20:20,611 82236299 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-04 12:20:20,785 82236299 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-04 12:20:20,807 82236299 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-04 12:20:20,828 82236299 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-04 12:20:20,831 82236299 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
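[Editor's note: a hypothetical reconstruction, with invented names, of the failing pattern behind the KeyError above — an unguarded dict lookup on the configured disk_type, which raises a bare KeyError when the config uses a value (here 'nodisk') the provider mapping does not know. PerfKitBenchmarker's real mapping and identifiers may differ; this only illustrates the failure mode and a guarded alternative.]

```python
# DISK_TYPE_MAP and resolve_disk_type are invented for illustration; they are
# not PerfKitBenchmarker's actual names.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    """Guarded lookup: raise a descriptive error instead of a bare KeyError."""
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported disk_type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))
```

With this guard, an unknown value like 'nodisk' fails with a message naming the supported types instead of the opaque KeyError seen in the log.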
2019-07-04 12:20:20,832 82236299 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-82236299 --format json --quiet --project apache-beam-testing
2019-07-04 12:20:21,331 82236299 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-82236299 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-82236299

2019-07-04 12:20:21,332 82236299 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-82236299 --format json --quiet --project apache-beam-testing
2019-07-04 12:20:21,799 82236299 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-82236299 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-82236299

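[Editor's note: the delete/describe commands above fail with NOT_FOUND only because provisioning aborted before the cluster existed, so the cleanup is effectively a no-op. A hypothetical sketch, with invented names, of teardown logic that classifies that case as benign rather than as a real deletion failure:]

```python
# classify_delete is an invented helper, not PerfKitBenchmarker's actual code.
def classify_delete(returncode, stderr):
    """Map a `gcloud dataproc clusters delete` result to a cleanup outcome."""
    if returncode == 0:
        return 'deleted'
    if 'NOT_FOUND' in stderr:
        return 'absent'   # cluster never existed; nothing to clean up
    return 'failed'       # a real deletion error worth surfacing
```

Under this classification the log's ReturnCode:1 with a NOT_FOUND STDERR would be reported as 'absent', not as a cleanup failure.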
2019-07-04 12:20:21,802 82236299 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-04 12:20:21,802 82236299 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-04 12:20:21,802 82236299 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-04 12:20:21,803 82236299 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/82236299/pkb.log>
2019-07-04 12:20:21,803 82236299 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/82236299/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3359

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3359/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 32e2e3e910619d400568073dde0d7b36698a416a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 32e2e3e910619d400568073dde0d7b36698a416a
Commit message: "Merge pull request #8986: [BEAM-4420] Add KafkaIO Integration Tests"
 > git rev-list --no-walk 32e2e3e910619d400568073dde0d7b36698a416a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2585243015906754866.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2778230604237467608.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6859198406218344215.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5115904181679371973.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5905434654083321888.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4611086875271509521.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8247692120776181136.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-04 06:16:31,411 27c34ac4 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/27c34ac4/pkb.log>
2019-07-04 06:16:31,411 27c34ac4 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-04 06:16:31,413 27c34ac4 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-04 06:16:31,690 27c34ac4 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-04 06:16:31,713 27c34ac4 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-04 06:16:31,735 27c34ac4 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-04 06:16:31,738 27c34ac4 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-04 06:16:31,739 27c34ac4 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-27c34ac4 --format json --quiet --project apache-beam-testing
2019-07-04 06:16:33,521 27c34ac4 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-27c34ac4 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-27c34ac4

2019-07-04 06:16:33,522 27c34ac4 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-27c34ac4 --format json --quiet --project apache-beam-testing
2019-07-04 06:16:34,137 27c34ac4 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-27c34ac4 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-27c34ac4

2019-07-04 06:16:34,140 27c34ac4 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-04 06:16:34,140 27c34ac4 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-04 06:16:34,140 27c34ac4 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-04 06:16:34,141 27c34ac4 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/27c34ac4/pkb.log>
2019-07-04 06:16:34,141 27c34ac4 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/27c34ac4/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3358

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3358/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 32e2e3e910619d400568073dde0d7b36698a416a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 32e2e3e910619d400568073dde0d7b36698a416a
Commit message: "Merge pull request #8986: [BEAM-4420] Add KafkaIO Integration Tests"
 > git rev-list --no-walk 32e2e3e910619d400568073dde0d7b36698a416a # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7205567890318657780.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8169579288382645771.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins22352802717849191.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5775416290427381974.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins495233122032891940.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6452865396170771076.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2642209257971788837.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-04 00:16:22,219 946eec76 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/946eec76/pkb.log>
2019-07-04 00:16:22,220 946eec76 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-04 00:16:22,222 946eec76 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-04 00:16:22,480 946eec76 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-04 00:16:22,504 946eec76 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-04 00:16:22,529 946eec76 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-04 00:16:22,531 946eec76 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
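The KeyError above is a plain dictionary-miss during provisioning: the worker group's disk spec carries the disk type "nodisk", which has no entry in the provider's disk-type map that `_Create` indexes into. A minimal sketch of that failure mode, assuming an illustrative map and function names (these are not PerfKitBenchmarker's actual structures):

```python
# Hypothetical stand-in for the disk-type map indexed in
# gcp_dpb_dataproc.py's _Create. "nodisk" has no entry, so a strict
# lookup raises KeyError and aborts the whole provisioning phase.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    """Strict lookup: mirrors the failing map[...disk_spec.disk_type] access."""
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_safe(disk_type, default=None):
    """Defensive variant: return a fallback instead of crashing provisioning."""
    return DISK_TYPE_MAP.get(disk_type, default)

try:
    resolve_disk_type('nodisk')
except KeyError as e:
    print('provisioning would fail with KeyError:', e)

print(resolve_disk_type_safe('nodisk'))  # prints None instead of crashing
```

Whether a silent fallback is the right fix is a design choice; the point is only that the traceback reflects an unguarded key access, not a cloud-side failure.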
2019-07-04 00:16:22,533 946eec76 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-946eec76 --format json --quiet --project apache-beam-testing
2019-07-04 00:16:24,073 946eec76 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-946eec76 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-946eec76

2019-07-04 00:16:24,074 946eec76 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-946eec76 --format json --quiet --project apache-beam-testing
2019-07-04 00:16:24,727 946eec76 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-946eec76 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-946eec76
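The delete and describe calls above exit with ReturnCode:1 and NOT_FOUND because provisioning failed before any cluster was created, so the teardown is cleaning up a resource that never existed. A hedged sketch of best-effort cleanup under that assumption, with `run_cmd` as a stand-in for however the command is actually issued (not PerfKitBenchmarker's real helper):

```python
# Best-effort cluster cleanup: treat NOT_FOUND on delete as "already
# absent" rather than a fatal error, since a failed provision phase
# legitimately leaves nothing to tear down. run_cmd is a hypothetical
# callable returning (returncode, stdout, stderr).
def cleanup_cluster(run_cmd, cluster_name):
    rc, stdout, stderr = run_cmd(
        ['gcloud', 'dataproc', 'clusters', 'delete', cluster_name,
         '--format', 'json', '--quiet'])
    if rc != 0 and 'NOT_FOUND' in stderr:
        return 'already-absent'  # nothing was provisioned; not an error
    return 'deleted' if rc == 0 else 'failed'

# Simulate the log above: gcloud exits 1 with a NOT_FOUND message.
fake_run = lambda argv: (
    1, '', 'ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found')
print(cleanup_cluster(fake_run, 'pkb-946eec76'))  # prints already-absent
```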

2019-07-04 00:16:24,730 946eec76 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-04 00:16:24,731 946eec76 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-04 00:16:24,731 946eec76 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-04 00:16:24,731 946eec76 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/946eec76/pkb.log>
2019-07-04 00:16:24,732 946eec76 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/946eec76/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3357

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3357/display/redirect?page=changes>

Changes:

[lukasz.gajowy] [BEAM-4420] Allow connecting to zookeeper using external ip

[lukasz.gajowy] [BEAM-4420] Add KafkaIO integration test pipeline

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 32e2e3e910619d400568073dde0d7b36698a416a (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 32e2e3e910619d400568073dde0d7b36698a416a
Commit message: "Merge pull request #8986: [BEAM-4420] Add KafkaIO Integration Tests"
 > git rev-list --no-walk b79f24ced1c8519c29443ea7109c59ad18be2ebe # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins316557230694026971.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8992653506107324849.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1836732817999248169.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6965427551458657659.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9151899837014244011.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins810084778099168960.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5051882002597372449.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-03 18:29:54,599 67359fe9 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/67359fe9/pkb.log>
2019-07-03 18:29:54,599 67359fe9 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-03 18:29:54,601 67359fe9 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-03 18:29:54,871 67359fe9 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-03 18:29:54,893 67359fe9 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
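The warning above fires because `--config_override` sets a nested config key via a dotted path, and one segment of the override path names a key absent from the default config. A minimal sketch of how such a dotted-path override can be applied (illustrative only, not PerfKitBenchmarker's actual implementation):

```python
def apply_override(config, dotted_key, value):
    """Apply a --config_override-style assignment: split the dotted path
    and set the leaf value, creating intermediate dicts as needed.
    (Illustrative sketch, not PerfKitBenchmarker's implementation.)"""
    keys = dotted_key.split('.')
    node = config
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config

cfg = apply_override(
    {}, 'dpb_wordcount_benchmark.dpb_service.service_type', 'dataproc')
print(cfg)
```

A typo in any path segment (e.g. a stray "flags" level) would silently create a new subtree rather than override the intended key, which is why a tool would warn when an override key is not present in the defaults.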
2019-07-03 18:29:54,914 67359fe9 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-03 18:29:54,917 67359fe9 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
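The `KeyError: 'nodisk'` above comes from an unguarded dictionary lookup on the worker group's disk type in the Dataproc provider. A minimal sketch of the failure mode and a defensive alternative (the mapping name and keys below are illustrative assumptions, not PerfKitBenchmarker's actual code):

```python
# Hypothetical disk-type mapping, mirroring the kind of bare lookup that
# raises KeyError: 'nodisk' in gcp_dpb_dataproc.py (names are illustrative).
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    """Return the provider disk type, failing with a clear message
    instead of a bare KeyError when the type is unsupported."""
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported disk type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))

print(resolve_disk_type('pd-ssd'))  # pd-ssd
```

With the bare lookup, a spec carrying an unmapped type like 'nodisk' surfaces as an opaque KeyError during provisioning, exactly as in this log.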
2019-07-03 18:29:54,918 67359fe9 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-67359fe9 --format json --quiet --project apache-beam-testing
2019-07-03 18:29:56,475 67359fe9 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-67359fe9 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-67359fe9

2019-07-03 18:29:56,476 67359fe9 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-67359fe9 --format json --quiet --project apache-beam-testing
2019-07-03 18:29:57,072 67359fe9 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-67359fe9 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-67359fe9
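The NOT_FOUND errors during teardown are expected here: provisioning failed before the cluster was ever created, so the delete and describe calls find nothing. A cleanup routine can treat NOT_FOUND as already-deleted rather than as a failure (a sketch; the injectable `runner` parameter is a stand-in for testing, not PerfKitBenchmarker's own command wrapper):

```python
import subprocess

def delete_cluster(name, project, runner=subprocess.run):
    """Delete a Dataproc cluster, treating NOT_FOUND as already gone."""
    cmd = ['gcloud', 'dataproc', 'clusters', 'delete', name,
           '--format', 'json', '--quiet', '--project', project]
    result = runner(cmd, capture_output=True, text=True)
    if result.returncode == 0:
        return True
    if 'NOT_FOUND' in (result.stderr or ''):
        # Provisioning failed before the cluster existed; nothing to do.
        return True
    raise RuntimeError('cluster delete failed: %s' % result.stderr)
```

Distinguishing NOT_FOUND from genuine delete failures keeps teardown idempotent while still surfacing real problems (e.g. permission errors).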

2019-07-03 18:29:57,075 67359fe9 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-03 18:29:57,075 67359fe9 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-03 18:29:57,076 67359fe9 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-03 18:29:57,076 67359fe9 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/67359fe9/pkb.log>
2019-07-03 18:29:57,076 67359fe9 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/67359fe9/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3356

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3356/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-6 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b79f24ced1c8519c29443ea7109c59ad18be2ebe (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b79f24ced1c8519c29443ea7109c59ad18be2ebe
Commit message: "Merge pull request #8878: [BEAM-5315] improve test coverage bigquery special chars"
 > git rev-list --no-walk b79f24ced1c8519c29443ea7109c59ad18be2ebe # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2952804965615115829.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2277051219805238145.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2358384234985072352.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5958293931802106629.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5321218550775960547.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3742353148527829190.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6225065307747162451.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-03 12:19:55,934 7195b51f MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7195b51f/pkb.log>
2019-07-03 12:19:55,934 7195b51f MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-03 12:19:55,935 7195b51f MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-03 12:19:56,091 7195b51f MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-03 12:19:56,113 7195b51f MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-03 12:19:56,135 7195b51f MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-03 12:19:56,138 7195b51f MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-03 12:19:56,139 7195b51f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-7195b51f --format json --quiet --project apache-beam-testing
2019-07-03 12:19:57,693 7195b51f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-7195b51f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-7195b51f

2019-07-03 12:19:57,694 7195b51f MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-7195b51f --format json --quiet --project apache-beam-testing
2019-07-03 12:19:58,314 7195b51f MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-7195b51f --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-7195b51f

2019-07-03 12:19:58,317 7195b51f MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-03 12:19:58,317 7195b51f MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-03 12:19:58,318 7195b51f MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-03 12:19:58,318 7195b51f MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7195b51f/pkb.log>
2019-07-03 12:19:58,318 7195b51f MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/7195b51f/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3355

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3355/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b79f24ced1c8519c29443ea7109c59ad18be2ebe (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b79f24ced1c8519c29443ea7109c59ad18be2ebe
Commit message: "Merge pull request #8878: [BEAM-5315] improve test coverage bigquery special chars"
 > git rev-list --no-walk b79f24ced1c8519c29443ea7109c59ad18be2ebe # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3779842263893336652.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1618060613872287261.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1810715632217842977.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8012975374043693558.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3776736198328387627.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5040810929208951830.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins519225758109114468.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-03 06:15:55,518 0a61e412 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0a61e412/pkb.log>
2019-07-03 06:15:55,519 0a61e412 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-03 06:15:55,520 0a61e412 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-03 06:15:55,681 0a61e412 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-03 06:15:55,701 0a61e412 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-03 06:15:55,721 0a61e412 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-03 06:15:55,723 0a61e412 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
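[Editor's note: the KeyError above comes from an unguarded dictionary lookup in the Dataproc provider: the benchmark spec's worker disk type is 'nodisk', but the provider's disk-type mapping has no such key. The sketch below is illustrative only; the map name and contents are assumptions, not PerfKitBenchmarker's actual code at gcp_dpb_dataproc.py line 130.]

```python
# Hypothetical disk-type map; the real one only covers the GCE disk types
# it knows how to provision, so any other value falls through to KeyError.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}


def resolve_disk_type(disk_type):
    # Unguarded lookup: reproduces the failure mode in the traceback above
    # when the spec carries an unmapped type such as 'nodisk'.
    return DISK_TYPE_MAP[disk_type]


def resolve_disk_type_checked(disk_type):
    # A defensive variant would fail with an actionable message instead of
    # a bare KeyError deep inside the provisioning phase.
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported Dataproc disk type: %r (expected one of %s)'
            % (disk_type, sorted(DISK_TYPE_MAP)))
```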
2019-07-03 06:15:55,724 0a61e412 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-0a61e412 --format json --quiet --project apache-beam-testing
2019-07-03 06:15:57,235 0a61e412 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-0a61e412 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-0a61e412

2019-07-03 06:15:57,236 0a61e412 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-0a61e412 --format json --quiet --project apache-beam-testing
2019-07-03 06:15:57,757 0a61e412 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-0a61e412 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-0a61e412

2019-07-03 06:15:57,780 0a61e412 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-03 06:15:57,780 0a61e412 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-03 06:15:57,781 0a61e412 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-03 06:15:57,781 0a61e412 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0a61e412/pkb.log>
2019-07-03 06:15:57,781 0a61e412 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/0a61e412/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3354

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3354/display/redirect?page=changes>

Changes:

[juta.staes] [BEAM-5315] improve test coverage bigquery special chars

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-4 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b79f24ced1c8519c29443ea7109c59ad18be2ebe (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b79f24ced1c8519c29443ea7109c59ad18be2ebe
Commit message: "Merge pull request #8878: [BEAM-5315] improve test coverage bigquery special chars"
 > git rev-list --no-walk 7f8359940502ea3b9b88ff429807989179417c5c # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6428002563974983113.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6944504116421309043.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4849794166209738167.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7439953415481384848.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7544981297080041796.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7514716576659709826.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8826012474638375464.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-03 00:20:42,347 e7b0d0c7 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/e7b0d0c7/pkb.log>
2019-07-03 00:20:42,348 e7b0d0c7 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1254-g24e348f
2019-07-03 00:20:42,349 e7b0d0c7 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-03 00:20:42,824 e7b0d0c7 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-03 00:20:42,846 e7b0d0c7 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-03 00:20:42,869 e7b0d0c7 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-03 00:20:42,871 e7b0d0c7 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-03 00:20:42,872 e7b0d0c7 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-e7b0d0c7 --format json --quiet --project apache-beam-testing
2019-07-03 00:20:43,541 e7b0d0c7 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-e7b0d0c7 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-e7b0d0c7

2019-07-03 00:20:43,541 e7b0d0c7 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-e7b0d0c7 --format json --quiet --project apache-beam-testing
2019-07-03 00:20:44,121 e7b0d0c7 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-e7b0d0c7 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-e7b0d0c7

2019-07-03 00:20:44,123 e7b0d0c7 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-03 00:20:44,124 e7b0d0c7 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-03 00:20:44,124 e7b0d0c7 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-03 00:20:44,124 e7b0d0c7 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/e7b0d0c7/pkb.log>
2019-07-03 00:20:44,125 e7b0d0c7 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/e7b0d0c7/completion_statuses.json>
Build step 'Execute shell' marked build as failure
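For readers triaging the failure above: the provisioning phase dies with `KeyError: 'nodisk'` because the Dataproc provider indexes a disk-type mapping directly with `self.spec.worker_group.disk_spec.disk_type`, and that mapping has no entry for `'nodisk'`. The sketch below reproduces the pattern with hypothetical names (`DISK_TYPE_MAP`, `resolve_disk_type`); it is not the actual PerfKitBenchmarker source, just an illustration of the crash and a defensive alternative.

```python
# Hypothetical stand-in for the disk-type lookup in
# perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py (names are illustrative).
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}


def resolve_disk_type(disk_type):
    """Direct indexing: raises KeyError for unmapped types such as 'nodisk'."""
    return DISK_TYPE_MAP[disk_type]


def resolve_disk_type_safe(disk_type, default='pd-standard'):
    """Defensive variant: fall back to a default instead of crashing."""
    return DISK_TYPE_MAP.get(disk_type, default)


if __name__ == '__main__':
    try:
        resolve_disk_type('nodisk')
    except KeyError as e:
        print('lookup failed:', e)           # lookup failed: 'nodisk'
    print(resolve_disk_type_safe('nodisk'))  # pd-standard
```

The real fix would need to decide what a `'nodisk'` spec should mean for Dataproc (e.g. skip the disk flag entirely) rather than silently substituting a default, but the sketch shows why the benchmark aborts before any cluster is created.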

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3353

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3353/display/redirect?page=changes>

Changes:

[daniel.o.programmer] Update python containers to beam-master-20190605

[kamil.wasilewski] [BEAM-7504] Added top_count parameter

[kamil.wasilewski] [BEAM-7504] Create Combine Python Load Test Jenkins job

[iemejia] [BEAM-7640] Rename the package name for amazon-web-services2 from aws to

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-1 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7f8359940502ea3b9b88ff429807989179417c5c (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7f8359940502ea3b9b88ff429807989179417c5c
Commit message: "Merge pull request #8813: [BEAM-7504] Created Combine Python Load Test Jenkins job"
 > git rev-list --no-walk 6aceab45906e79c7f136ea236e5b120f123b599b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1541396703234181367.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5211972955607715728.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1827246677908344459.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins953725917972485016.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6739629329973838550.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5203278357719177859.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins987219891384632041.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-02 18:24:26,643 5a78a6bd MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/5a78a6bd/pkb.log>
2019-07-02 18:24:26,644 5a78a6bd MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-07-02 18:24:26,645 5a78a6bd MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-02 18:24:26,916 5a78a6bd MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-02 18:24:26,938 5a78a6bd MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-02 18:24:26,959 5a78a6bd MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-02 18:24:26,961 5a78a6bd MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-02 18:24:26,962 5a78a6bd MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-5a78a6bd --format json --quiet --project apache-beam-testing
2019-07-02 18:24:28,590 5a78a6bd MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-5a78a6bd --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-5a78a6bd

2019-07-02 18:24:28,590 5a78a6bd MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-5a78a6bd --format json --quiet --project apache-beam-testing
2019-07-02 18:24:29,164 5a78a6bd MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-5a78a6bd --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-5a78a6bd

2019-07-02 18:24:29,167 5a78a6bd MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-02 18:24:29,167 5a78a6bd MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-02 18:24:29,167 5a78a6bd MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-02 18:24:29,167 5a78a6bd MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/5a78a6bd/pkb.log>
2019-07-02 18:24:29,168 5a78a6bd MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/5a78a6bd/completion_statuses.json>
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_Spark #3352

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3352/display/redirect?page=changes>

Changes:

[kamil.wasilewski] [BEAM-7536] Fixed BQ dataset name in collecting Load Tests metrics

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 6aceab45906e79c7f136ea236e5b120f123b599b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6aceab45906e79c7f136ea236e5b120f123b599b
Commit message: "Merge pull request #8834: [BEAM-7536] Fix BigQuery dataset name in collecting Load Tests metrics"
 > git rev-list --no-walk 05900a43de1f380f124c688cd8d804b5133bda64 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins584117052940567022.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2469021042014566013.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins624472956036136597.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5803850812026446748.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5841844736609542929.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins664844803157762780.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2478825472098459216.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-02 12:59:19,296 08f4b9b0 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/08f4b9b0/pkb.log>
2019-07-02 12:59:19,297 08f4b9b0 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-07-02 12:59:19,298 08f4b9b0 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-02 12:59:19,913 08f4b9b0 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-02 12:59:19,935 08f4b9b0 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-02 12:59:19,956 08f4b9b0 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-02 12:59:19,958 08f4b9b0 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
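The trace bottoms out in a dictionary lookup inside the Dataproc provider: the worker group's disk_spec.disk_type ('nodisk', which the Kubernetes-oriented benchmark config supplies) has no entry in the provider's disk-type mapping. A minimal sketch of that failure mode, with illustrative names and values rather than PerfKitBenchmarker's real table:

```python
# Hypothetical stand-in for the disk-type table consulted in
# gcp_dpb_dataproc.py; the real mapping and its contents may differ.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}


def resolve_disk_type(disk_type):
    """Map a spec's disk_type to a GCP disk type, with a clearer error."""
    try:
        # A bare DISK_TYPE_MAP['nodisk'] is exactly the lookup that raises
        # KeyError: 'nodisk' in the log above.
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            "disk_type %r is not supported by the Dataproc provider; "
            "expected one of %s" % (disk_type, sorted(DISK_TYPE_MAP)))
```

Guarding the lookup this way would turn the opaque KeyError into a message naming the unsupported disk_type and the accepted values.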
2019-07-02 12:59:19,960 08f4b9b0 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-08f4b9b0 --format json --quiet --project apache-beam-testing
2019-07-02 12:59:21,265 08f4b9b0 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-08f4b9b0 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-08f4b9b0

2019-07-02 12:59:21,266 08f4b9b0 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-08f4b9b0 --format json --quiet --project apache-beam-testing
2019-07-02 12:59:21,828 08f4b9b0 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-08f4b9b0 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-08f4b9b0

2019-07-02 12:59:21,830 08f4b9b0 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-02 12:59:21,831 08f4b9b0 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-02 12:59:21,831 08f4b9b0 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-02 12:59:21,831 08f4b9b0 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/08f4b9b0/pkb.log>
2019-07-02 12:59:21,831 08f4b9b0 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/08f4b9b0/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3351

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3351/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 05900a43de1f380f124c688cd8d804b5133bda64 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 05900a43de1f380f124c688cd8d804b5133bda64
Commit message: "Merge pull request #8974 from youngoli/patch-7"
 > git rev-list --no-walk 05900a43de1f380f124c688cd8d804b5133bda64 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins767486863644277943.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5614224059408433932.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1167174391010007470.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5308580895084582496.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3033913770875011960.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3366162635440336189.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins798063794178957998.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-02 06:19:59,076 1eda7f9c MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1eda7f9c/pkb.log>
2019-07-02 06:19:59,077 1eda7f9c MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-07-02 06:19:59,079 1eda7f9c MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-02 06:19:59,371 1eda7f9c MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-02 06:19:59,394 1eda7f9c MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-02 06:19:59,416 1eda7f9c MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-02 06:19:59,419 1eda7f9c MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
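The KeyError above comes from indexing a disk-type mapping that has no entry for the configured value 'nodisk'. A minimal sketch of that failure mode and a guarded lookup, using hypothetical names (DISK_TYPE_MAP, resolve_disk_type) as a simplified stand-in for the provider code, not its actual identifiers:

```python
# Hypothetical, simplified stand-in for the disk-type mapping consulted in
# gcp_dpb_dataproc.py's _Create; the real mapping and names may differ.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type, default='pd-standard'):
    """Look up a disk type without raising on unsupported values.

    A bare DISK_TYPE_MAP[disk_type] raises KeyError: 'nodisk' for an
    unsupported value, which is what aborts provisioning in the log above;
    dict.get substitutes a fallback instead.
    """
    return DISK_TYPE_MAP.get(disk_type, default)
```

This only illustrates why the lookup fails; the real fix is for the benchmark config to pass a disk type the provider's mapping supports.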
2019-07-02 06:19:59,420 1eda7f9c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-1eda7f9c --format json --quiet --project apache-beam-testing
2019-07-02 06:20:01,324 1eda7f9c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-1eda7f9c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-1eda7f9c

2019-07-02 06:20:01,325 1eda7f9c MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-1eda7f9c --format json --quiet --project apache-beam-testing
2019-07-02 06:20:01,849 1eda7f9c MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-1eda7f9c --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-1eda7f9c

2019-07-02 06:20:01,851 1eda7f9c MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-02 06:20:01,852 1eda7f9c MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-02 06:20:01,852 1eda7f9c MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-02 06:20:01,852 1eda7f9c MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1eda7f9c/pkb.log>
2019-07-02 06:20:01,852 1eda7f9c MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1eda7f9c/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3350

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3350/display/redirect?page=changes>

Changes:

[hannahjiang] BEAM-3645 add ParallelBundleProcessor

[hannahjiang] BEAM-3645 reflect comments

[hannahjiang] BEAM-3645 add changes from review comments

[hannahjiang] BEAM-3645 add thread lock when generating process_bundle_id

[github] Tiny typo fix

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 05900a43de1f380f124c688cd8d804b5133bda64 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 05900a43de1f380f124c688cd8d804b5133bda64
Commit message: "Merge pull request #8974 from youngoli/patch-7"
 > git rev-list --no-walk 3d576f7c4f86c0b12bd7b682fe2816fbd1137bbe # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3411076288975184145.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3472688011125621118.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4087252318896501643.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1679649489218661757.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4781970420537357152.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8633789941713962404.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3523640305326873898.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-02 00:31:22,374 27b71297 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/27b71297/pkb.log>
2019-07-02 00:31:22,375 27b71297 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-07-02 00:31:22,376 27b71297 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-02 00:31:22,560 27b71297 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-02 00:31:22,584 27b71297 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-02 00:31:22,608 27b71297 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-02 00:31:22,611 27b71297 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-02 00:31:22,612 27b71297 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-27b71297 --format json --quiet --project apache-beam-testing
2019-07-02 00:31:24,193 27b71297 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-27b71297 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-27b71297

2019-07-02 00:31:24,194 27b71297 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-27b71297 --format json --quiet --project apache-beam-testing
2019-07-02 00:31:24,865 27b71297 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-27b71297 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-27b71297

2019-07-02 00:31:24,868 27b71297 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-02 00:31:24,868 27b71297 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-02 00:31:24,869 27b71297 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-02 00:31:24,869 27b71297 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/27b71297/pkb.log>
2019-07-02 00:31:24,869 27b71297 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/27b71297/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3349

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3349/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-7640] Create amazon-web-services2 module and AwsOptions

[alireza4263] [BEAM-7545] Adding RowCount to TextTable.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3d576f7c4f86c0b12bd7b682fe2816fbd1137bbe (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3d576f7c4f86c0b12bd7b682fe2816fbd1137bbe
Commit message: "Merge pull request #8951 from riazela/TextTableRowCount"
 > git rev-list --no-walk f22070bc224d75dadb553c7dce4ab5f08120978b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6283829795129054928.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7062379720807294402.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2594773629236777516.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6147972063693928836.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5568703813796481513.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3648896062996808346.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins786547062320698551.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-01 18:59:27,228 1a86b821 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1a86b821/pkb.log>
2019-07-01 18:59:27,228 1a86b821 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-07-01 18:59:27,229 1a86b821 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-01 18:59:27,917 1a86b821 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-01 18:59:27,941 1a86b821 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-01 18:59:27,964 1a86b821 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-01 18:59:27,967 1a86b821 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
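[Editor's note: the traceback above ends in a plain dict lookup on the spec's disk_type. The sketch below is a hypothetical reconstruction of that pattern, not the actual gcp_dpb_dataproc.py code; `DISK_TYPE_MAP` and `resolve_disk_type` are illustrative names. It shows why the placeholder value 'nodisk' raises KeyError and how a defensive lookup would avoid aborting provisioning.]

```python
# Illustrative disk-type mapping; the real map in gcp_dpb_dataproc.py
# only covers concrete GCE disk types, so 'nodisk' is absent.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    """Defensive variant: fall back to a default instead of raising.

    Plain indexing (DISK_TYPE_MAP[disk_type]) is what the traceback
    shows failing with KeyError: 'nodisk'.
    """
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        # 'nodisk' (and any other unknown value) lands here.
        return 'pd-standard'
```

Whether falling back or failing fast with a clearer message is the right fix depends on what the benchmark spec intends 'nodisk' to mean.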
2019-07-01 18:59:27,968 1a86b821 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-1a86b821 --format json --quiet --project apache-beam-testing
2019-07-01 18:59:29,372 1a86b821 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-1a86b821 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-1a86b821

2019-07-01 18:59:29,373 1a86b821 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-1a86b821 --format json --quiet --project apache-beam-testing
2019-07-01 18:59:29,967 1a86b821 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-1a86b821 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-1a86b821

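[Editor's note: the delete/describe calls above fail with NOT_FOUND only because the cluster was never created. A cleanup path can treat NOT_FOUND as success, since the desired end state (no cluster) already holds. The helper below is a minimal sketch of that check, assuming gcloud's behavior of a nonzero exit code with "NOT_FOUND" on stderr; `cleanup_ok` is a hypothetical name, not a PerfKitBenchmarker function.]

```python
def cleanup_ok(returncode, stderr):
    """Treat a delete as successful if the command succeeded OR the
    resource was already gone (gcloud reports NOT_FOUND on stderr)."""
    return returncode == 0 or 'NOT_FOUND' in stderr
```

With this check, the two NOT_FOUND errors in the log would be logged and ignored rather than surfaced as failures.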
2019-07-01 18:59:29,970 1a86b821 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-01 18:59:29,970 1a86b821 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-01 18:59:29,971 1a86b821 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-01 18:59:29,971 1a86b821 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1a86b821/pkb.log>
2019-07-01 18:59:29,971 1a86b821 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/1a86b821/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3348

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3348/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-12 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f22070bc224d75dadb553c7dce4ab5f08120978b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f22070bc224d75dadb553c7dce4ab5f08120978b
Commit message: "Merge pull request #8966: [BEAM-6692] portable Spark: reshuffle translation"
 > git rev-list --no-walk f22070bc224d75dadb553c7dce4ab5f08120978b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5778798614615018955.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3478084436616858654.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5191244587732898733.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4007421192084958404.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2332480716286992613.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5375851521636472966.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1194376823278408333.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-01 12:26:31,185 00b30740 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/00b30740/pkb.log>
2019-07-01 12:26:31,185 00b30740 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-07-01 12:26:31,187 00b30740 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-01 12:26:31,454 00b30740 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-01 12:26:31,477 00b30740 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-01 12:26:31,499 00b30740 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-01 12:26:31,501 00b30740 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
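The KeyError above comes from indexing a disk-type lookup table with a value it does not contain: the benchmark's worker spec configures no disk, which yields the key 'nodisk'. A minimal sketch of that failure mode (the table contents and function names here are illustrative assumptions, not the actual PerfKitBenchmarker source):

```python
# Illustrative only: gcp_dpb_dataproc.py maps disk_spec.disk_type
# through a GCP-specific table; 'nodisk' is absent from that table.
DISK_TYPE_MAP = {
    "pd-standard": "pd-standard",  # assumed example entries
    "pd-ssd": "pd-ssd",
}

def resolve_disk_type(disk_type):
    # Direct indexing, as in the traceback, raises KeyError for any
    # unlisted type instead of reporting a configuration problem.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_checked(disk_type):
    # A guarded lookup turns the same condition into a clear error.
    if disk_type not in DISK_TYPE_MAP:
        raise ValueError(
            "unsupported disk_type %r; expected one of %s"
            % (disk_type, sorted(DISK_TYPE_MAP)))
    return DISK_TYPE_MAP[disk_type]
```

Either way the provisioning phase fails before any cluster is created, which explains the cleanup errors that follow.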
2019-07-01 12:26:31,503 00b30740 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-00b30740 --format json --quiet --project apache-beam-testing
2019-07-01 12:26:32,890 00b30740 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-00b30740 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-00b30740

2019-07-01 12:26:32,891 00b30740 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-00b30740 --format json --quiet --project apache-beam-testing
2019-07-01 12:26:33,480 00b30740 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-00b30740 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-00b30740
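The NOT_FOUND errors from the delete and describe cleanup commands above are expected here: provisioning failed before the cluster was ever created, so there is nothing to tear down. A hedged sketch (not the actual PKB cleanup code) of how such a cleanup result can be classified:

```python
def cleanup_succeeded(returncode, stderr):
    """Classify a `gcloud dataproc clusters delete` result.

    Illustrative helper, not from PerfKitBenchmarker: exit code 0
    means the cluster was deleted; NOT_FOUND means it never existed,
    which is equally fine for cleanup after a failed provision.
    """
    if returncode == 0:
        return True
    return "NOT_FOUND" in stderr

# Both commands in the log above fall into the NOT_FOUND case.
stderr = ("ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: "
          "Cluster projects/apache-beam-testing/regions/global/clusters/pkb-00b30740")
print(cleanup_succeeded(1, stderr))  # True
```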

2019-07-01 12:26:33,483 00b30740 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-01 12:26:33,483 00b30740 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-01 12:26:33,484 00b30740 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-01 12:26:33,484 00b30740 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/00b30740/pkb.log>
2019-07-01 12:26:33,484 00b30740 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/00b30740/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3347

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3347/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f22070bc224d75dadb553c7dce4ab5f08120978b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f22070bc224d75dadb553c7dce4ab5f08120978b
Commit message: "Merge pull request #8966: [BEAM-6692] portable Spark: reshuffle translation"
 > git rev-list --no-walk f22070bc224d75dadb553c7dce4ab5f08120978b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6923141689232548892.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5502896366338717101.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4154434505928987851.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5585581588904283694.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7508909419944227898.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7498123042003207095.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4566877834450688978.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-01 06:15:50,920 fe1ae5e7 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/fe1ae5e7/pkb.log>
2019-07-01 06:15:50,920 fe1ae5e7 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-07-01 06:15:50,921 fe1ae5e7 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-01 06:15:51,267 fe1ae5e7 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-01 06:15:51,289 fe1ae5e7 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-01 06:15:51,309 fe1ae5e7 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-01 06:15:51,312 fe1ae5e7 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-01 06:15:51,313 fe1ae5e7 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-fe1ae5e7 --format json --quiet --project apache-beam-testing
2019-07-01 06:15:51,875 fe1ae5e7 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-fe1ae5e7 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-fe1ae5e7

2019-07-01 06:15:51,876 fe1ae5e7 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-fe1ae5e7 --format json --quiet --project apache-beam-testing
2019-07-01 06:15:52,407 fe1ae5e7 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-fe1ae5e7 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-fe1ae5e7

2019-07-01 06:15:52,410 fe1ae5e7 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-01 06:15:52,410 fe1ae5e7 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-01 06:15:52,410 fe1ae5e7 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-01 06:15:52,411 fe1ae5e7 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/fe1ae5e7/pkb.log>
2019-07-01 06:15:52,411 fe1ae5e7 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/fe1ae5e7/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3346

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3346/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f22070bc224d75dadb553c7dce4ab5f08120978b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f22070bc224d75dadb553c7dce4ab5f08120978b
Commit message: "Merge pull request #8966: [BEAM-6692] portable Spark: reshuffle translation"
 > git rev-list --no-walk f22070bc224d75dadb553c7dce4ab5f08120978b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7733893354921916232.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8787417450566270689.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8779576056220494634.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5770660546156434370.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins185782146958292415.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3963465274591749057.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2013520203784257810.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-07-01 00:15:51,981 63c063e3 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/63c063e3/pkb.log>
2019-07-01 00:15:51,981 63c063e3 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-07-01 00:15:51,982 63c063e3 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-07-01 00:15:52,148 63c063e3 MainThread INFO     Setting --max_concurrent_threads=200.
2019-07-01 00:15:52,170 63c063e3 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-07-01 00:15:52,197 63c063e3 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-07-01 00:15:52,200 63c063e3 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
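Editor's note on the failure above: the benchmark dies before any cluster is created because `_Create` in `gcp_dpb_dataproc.py` indexes a disk-type mapping with the key `'nodisk'`, which the mapping does not contain. A minimal, hypothetical sketch of that failure mode (the dict name and contents are illustrative, not PKB's actual code) and a guarded lookup that would surface a clearer provisioning error:

```python
# Illustrative mapping from PKB disk-spec types to Dataproc disk types.
# 'nodisk' is intentionally absent, mirroring the KeyError in the log.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    """Look up a Dataproc disk type, failing with a readable error.

    Direct indexing (DISK_TYPE_MAP[disk_type]) raises a bare
    KeyError: 'nodisk', as seen in the traceback above; wrapping it
    names the unsupported value and the accepted alternatives.
    """
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError(
            'Unsupported disk type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_MAP)))
```

This is only a diagnosis aid for readers of the log; the real fix belongs in PerfKitBenchmarker's Dataproc provider or in the benchmark's disk-spec configuration.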
2019-07-01 00:15:52,201 63c063e3 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-63c063e3 --format json --quiet --project apache-beam-testing
2019-07-01 00:15:52,763 63c063e3 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-63c063e3 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-63c063e3

2019-07-01 00:15:52,763 63c063e3 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-63c063e3 --format json --quiet --project apache-beam-testing
2019-07-01 00:15:53,282 63c063e3 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-63c063e3 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-63c063e3

2019-07-01 00:15:53,285 63c063e3 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-07-01 00:15:53,285 63c063e3 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-07-01 00:15:53,286 63c063e3 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-07-01 00:15:53,286 63c063e3 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/63c063e3/pkb.log>
2019-07-01 00:15:53,286 63c063e3 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/63c063e3/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3345

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3345/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f22070bc224d75dadb553c7dce4ab5f08120978b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f22070bc224d75dadb553c7dce4ab5f08120978b
Commit message: "Merge pull request #8966: [BEAM-6692] portable Spark: reshuffle translation"
 > git rev-list --no-walk f22070bc224d75dadb553c7dce4ab5f08120978b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1559956219365046954.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4400877742371224523.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1972601263673395253.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5484487916984316042.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7935412155023929307.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8078788736427467644.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2877030011885637318.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-06-30 18:18:36,489 60eb3aed MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/60eb3aed/pkb.log>
2019-06-30 18:18:36,490 60eb3aed MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-06-30 18:18:36,491 60eb3aed MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-06-30 18:18:36,784 60eb3aed MainThread INFO     Setting --max_concurrent_threads=200.
2019-06-30 18:18:36,806 60eb3aed MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-06-30 18:18:36,828 60eb3aed MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-06-30 18:18:36,830 60eb3aed MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
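The traceback above ends in an unguarded dictionary lookup: the benchmark config requests disk type `nodisk`, which the Dataproc provider's disk-type mapping does not contain. A minimal sketch of that failure mode and a defensive variant follows; the names (`DATAPROC_DISK_TYPE_MAP`, `resolve_disk_type`) are hypothetical illustrations, not PerfKitBenchmarker's actual code.

```python
# Hypothetical stand-in for the provider's disk-type mapping in
# gcp_dpb_dataproc.py; the real map's contents are an assumption here.
DATAPROC_DISK_TYPE_MAP = {
    "pd-standard": "pd-standard",
    "pd-ssd": "pd-ssd",
}


def resolve_disk_type(disk_type):
    # Unguarded lookup: mirrors the traceback's
    # `...[self.spec.worker_group.disk_spec.disk_type]` access,
    # which raises KeyError for an unmapped type such as 'nodisk'.
    return DATAPROC_DISK_TYPE_MAP[disk_type]


def resolve_disk_type_safe(disk_type, default="pd-standard"):
    # Defensive variant: fall back to a default so an unknown type
    # surfaces as a config decision rather than a provision crash.
    return DATAPROC_DISK_TYPE_MAP.get(disk_type, default)


try:
    resolve_disk_type("nodisk")
except KeyError as err:
    print("unmapped disk type:", err)  # reproduces KeyError: 'nodisk'

print(resolve_disk_type_safe("nodisk"))
```

Whether falling back to a default is correct (as opposed to rejecting the config with a clear validation error) depends on what `nodisk` is meant to express in the benchmark spec.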
2019-06-30 18:18:36,832 60eb3aed MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-60eb3aed --format json --quiet --project apache-beam-testing
2019-06-30 18:18:38,438 60eb3aed MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-60eb3aed --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-60eb3aed

2019-06-30 18:18:38,439 60eb3aed MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-60eb3aed --format json --quiet --project apache-beam-testing
2019-06-30 18:18:38,978 60eb3aed MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-60eb3aed --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-60eb3aed
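The two NOT_FOUND errors above are expected: provisioning crashed before the cluster was created, so the teardown's `delete`/`describe` calls have nothing to act on. A cleanup wrapper can classify that case instead of surfacing it as a new failure. This is a hedged sketch with a hypothetical helper, not PKB's actual cleanup logic.

```python
import subprocess


def is_not_found(returncode, stderr):
    # gcloud signals a missing resource with a nonzero exit code and
    # a NOT_FOUND marker in stderr, as in the log output above.
    return returncode != 0 and "NOT_FOUND" in stderr


def delete_cluster(name, project):
    # Hypothetical idempotent teardown: returns False when the cluster
    # never existed, raises only on genuine failures.
    cmd = [
        "gcloud", "dataproc", "clusters", "delete", name,
        "--format", "json", "--quiet", "--project", project,
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if is_not_found(result.returncode, result.stderr):
        return False  # nothing was provisioned; nothing to clean up
    result.check_returncode()
    return True
```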

2019-06-30 18:18:38,980 60eb3aed MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-06-30 18:18:38,981 60eb3aed MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-06-30 18:18:38,981 60eb3aed MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-06-30 18:18:38,981 60eb3aed MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/60eb3aed/pkb.log>
2019-06-30 18:18:38,981 60eb3aed MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/60eb3aed/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3344

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3344/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-7 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f22070bc224d75dadb553c7dce4ab5f08120978b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f22070bc224d75dadb553c7dce4ab5f08120978b
Commit message: "Merge pull request #8966: [BEAM-6692] portable Spark: reshuffle translation"
 > git rev-list --no-walk f22070bc224d75dadb553c7dce4ab5f08120978b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2925490934349301120.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3815238636383786502.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2352378369427798238.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1702037023364459699.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4399330724293367502.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7605286494928608082.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7555170631833373223.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-06-30 12:19:48,150 79af7f7a MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/79af7f7a/pkb.log>
2019-06-30 12:19:48,151 79af7f7a MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-06-30 12:19:48,152 79af7f7a MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-06-30 12:19:48,339 79af7f7a MainThread INFO     Setting --max_concurrent_threads=200.
2019-06-30 12:19:48,362 79af7f7a MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-06-30 12:19:48,384 79af7f7a MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-06-30 12:19:48,386 79af7f7a MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-06-30 12:19:48,388 79af7f7a MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-79af7f7a --format json --quiet --project apache-beam-testing
2019-06-30 12:19:50,079 79af7f7a MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-79af7f7a --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-79af7f7a

2019-06-30 12:19:50,080 79af7f7a MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-79af7f7a --format json --quiet --project apache-beam-testing
2019-06-30 12:19:50,699 79af7f7a MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-79af7f7a --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-79af7f7a

2019-06-30 12:19:50,702 79af7f7a MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py",> line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py",> line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py",> line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-06-30 12:19:50,702 79af7f7a MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-06-30 12:19:50,703 79af7f7a MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-06-30 12:19:50,703 79af7f7a MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/79af7f7a/pkb.log>
2019-06-30 12:19:50,703 79af7f7a MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/79af7f7a/completion_statuses.json>
Build step 'Execute shell' marked build as failure



Build failed in Jenkins: beam_PerformanceTests_Spark #3343

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3343/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-5 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f22070bc224d75dadb553c7dce4ab5f08120978b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f22070bc224d75dadb553c7dce4ab5f08120978b
Commit message: "Merge pull request #8966: [BEAM-6692] portable Spark: reshuffle translation"
 > git rev-list --no-walk f22070bc224d75dadb553c7dce4ab5f08120978b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5426644701542338448.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3279249546367053847.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9082971254921680858.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8159350576891814481.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8403909381601303987.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker'...>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8041057879705856226.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins8779721261138962653.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-06-30 06:18:03,326 897097b1 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/897097b1/pkb.log>
2019-06-30 06:18:03,326 897097b1 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-06-30 06:18:03,327 897097b1 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-06-30 06:18:03,492 897097b1 MainThread INFO     Setting --max_concurrent_threads=200.
2019-06-30 06:18:03,514 897097b1 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-06-30 06:18:03,538 897097b1 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-06-30 06:18:03,540 897097b1 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
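The provisioning failure is a plain dictionary-lookup miss: `_Create` indexes a disk-type mapping with the spec's `disk_spec.disk_type`, and `'nodisk'` is not a key in that table. A minimal sketch of the failure mode and a more defensive variant is below; `DISK_TYPE_TO_API_NAME` and `resolve_disk_type` are hypothetical stand-ins, not the actual names used in gcp_dpb_dataproc.py.

```python
# Hypothetical stand-in for the provider's disk-type table; the real
# mapping in gcp_dpb_dataproc.py may use different keys and names.
DISK_TYPE_TO_API_NAME = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Direct indexing reproduces the crash seen in the log:
    #   DISK_TYPE_TO_API_NAME['nodisk']  ->  KeyError: 'nodisk'
    # Using .get() plus an explicit error keeps the message actionable
    # instead of surfacing a bare KeyError deep inside provisioning.
    api_name = DISK_TYPE_TO_API_NAME.get(disk_type)
    if api_name is None:
        raise ValueError(
            'Unsupported disk_type %r; expected one of %s'
            % (disk_type, sorted(DISK_TYPE_TO_API_NAME)))
    return api_name
```

Either way the benchmark spec is asking for a disk type the GCP Dataproc provider does not recognize, so the fix belongs in the spec or the mapping, not in the lookup itself.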
2019-06-30 06:18:03,542 897097b1 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-897097b1 --format json --quiet --project apache-beam-testing
2019-06-30 06:18:05,234 897097b1 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-897097b1 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-897097b1
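Teardown then runs `gcloud dataproc clusters delete` for a cluster that was never created, so the NOT_FOUND error (ReturnCode:1) is expected noise rather than a second failure. A hedged sketch of an idempotent delete that treats NOT_FOUND as success; the cluster name and flags mirror the log, while the stderr pattern match is an assumption about gcloud's output format.

```shell
#!/bin/sh
# Idempotent delete: succeed when the cluster is already gone.
# Sketch only; assumes gcloud reports NOT_FOUND on stderr as in the log.
delete_cluster() {
  cluster="$1"
  err=$(gcloud dataproc clusters delete "$cluster" \
        --format json --quiet --project apache-beam-testing 2>&1) && return 0
  case "$err" in
    *NOT_FOUND*) echo "cluster $cluster already absent"; return 0 ;;
    *)           echo "$err" >&2; return 1 ;;
  esac
}
```

A wrapper like this keeps cleanup from masking the real provisioning error with a spurious delete failure, while still surfacing genuinely unexpected gcloud errors.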

2019-06-30 06:18:05,235 897097b1 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-897097b1 --format json --quiet --project apache-beam-testing
2019-06-30 06:18:05,777 897097b1 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-897097b1 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-897097b1

2019-06-30 06:18:05,779 897097b1 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-06-30 06:18:05,780 897097b1 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-06-30 06:18:05,781 897097b1 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-06-30 06:18:05,781 897097b1 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/897097b1/pkb.log>
2019-06-30 06:18:05,781 897097b1 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/897097b1/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3342

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3342/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f22070bc224d75dadb553c7dce4ab5f08120978b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f22070bc224d75dadb553c7dce4ab5f08120978b
Commit message: "Merge pull request #8966: [BEAM-6692] portable Spark: reshuffle translation"
 > git rev-list --no-walk f22070bc224d75dadb553c7dce4ab5f08120978b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3961699331527995165.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5170250999749897875.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4981241611344289865.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9064436445717893058.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins689432746611191955.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1008003141351316559.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4871988981347644454.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-06-30 00:16:17,051 f9ca0c6d MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/f9ca0c6d/pkb.log>
2019-06-30 00:16:17,051 f9ca0c6d MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-06-30 00:16:17,053 f9ca0c6d MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-06-30 00:16:17,289 f9ca0c6d MainThread INFO     Setting --max_concurrent_threads=200.
2019-06-30 00:16:17,312 f9ca0c6d MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-06-30 00:16:17,337 f9ca0c6d MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-06-30 00:16:17,339 f9ca0c6d MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
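[Note: the KeyError above is raised by a plain dict lookup in gcp_dpb_dataproc.py — the benchmark's disk_type ('nodisk') is used as a key into a GCP disk-type mapping that has no such entry. A minimal sketch of the failing pattern, with hypothetical map contents and function names (the real module's table and names may differ):

```python
# Illustrative disk-type map; the real table in
# perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py may differ.
DISK_TYPE_MAP = {
    'standard': 'pd-standard',
    'remote_ssd': 'pd-ssd',
}

def resolve_disk_type(disk_type):
    # Mirrors DISK_TYPE_MAP[self.spec.worker_group.disk_spec.disk_type]:
    # an unknown key such as 'nodisk' raises KeyError.
    return DISK_TYPE_MAP[disk_type]

def resolve_disk_type_checked(disk_type):
    # A defensive variant that fails with a clearer message instead of
    # a bare KeyError during the provision phase.
    try:
        return DISK_TYPE_MAP[disk_type]
    except KeyError:
        raise ValueError('Unsupported Dataproc disk type: %r' % disk_type)
```

Calling resolve_disk_type('nodisk') reproduces the KeyError seen in this build.]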
2019-06-30 00:16:17,341 f9ca0c6d MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-f9ca0c6d --format json --quiet --project apache-beam-testing
2019-06-30 00:16:17,955 f9ca0c6d MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-f9ca0c6d --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-f9ca0c6d

2019-06-30 00:16:17,956 f9ca0c6d MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-f9ca0c6d --format json --quiet --project apache-beam-testing
2019-06-30 00:16:18,516 f9ca0c6d MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-f9ca0c6d --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-f9ca0c6d

2019-06-30 00:16:18,519 f9ca0c6d MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-06-30 00:16:18,520 f9ca0c6d MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-06-30 00:16:18,520 f9ca0c6d MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-06-30 00:16:18,520 f9ca0c6d MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/f9ca0c6d/pkb.log>
2019-06-30 00:16:18,520 f9ca0c6d MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/f9ca0c6d/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PerformanceTests_Spark #3341

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3341/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f22070bc224d75dadb553c7dce4ab5f08120978b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f22070bc224d75dadb553c7dce4ab5f08120978b
Commit message: "Merge pull request #8966: [BEAM-6692] portable Spark: reshuffle translation"
 > git rev-list --no-walk f22070bc224d75dadb553c7dce4ab5f08120978b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2538034738856775147.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6897285731506797872.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6465325686450457071.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4359765106320314914.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3988032566964319348.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5342016603505613655.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins5840951571781806282.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-06-29 18:15:31,752 9092c998 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/9092c998/pkb.log>
2019-06-29 18:15:31,753 9092c998 MainThread INFO     PerfKitBenchmarker version: v1.12.0-1245-gf8515b1
2019-06-29 18:15:31,754 9092c998 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-06-29 18:15:32,027 9092c998 MainThread INFO     Setting --max_concurrent_threads=200.
2019-06-29 18:15:32,050 9092c998 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-06-29 18:15:32,073 9092c998 MainThread dpb_wordcount_benchmark(1/1) INFO     Provisioning resources for benchmark dpb_wordcount_benchmark
2019-06-29 18:15:32,076 9092c998 MainThread dpb_wordcount_benchmark(1/1) ERROR    Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-06-29 18:15:32,077 9092c998 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-9092c998 --format json --quiet --project apache-beam-testing
2019-06-29 18:15:32,810 9092c998 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters delete pkb-9092c998 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-9092c998

2019-06-29 18:15:32,811 9092c998 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-9092c998 --format json --quiet --project apache-beam-testing
2019-06-29 18:15:33,431 9092c998 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-9092c998 --format json --quiet --project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-9092c998

2019-06-29 18:15:33,433 9092c998 MainThread dpb_wordcount_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-06-29 18:15:33,434 9092c998 MainThread dpb_wordcount_benchmark(1/1) ERROR    Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. Execution will continue.
2019-06-29 18:15:33,434 9092c998 MainThread dpb_wordcount_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-06-29 18:15:33,434 9092c998 MainThread dpb_wordcount_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/9092c998/pkb.log>
2019-06-29 18:15:33,434 9092c998 MainThread dpb_wordcount_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/9092c998/completion_statuses.json>
Build step 'Execute shell' marked build as failure
