Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/02/10 18:09:46 UTC

Build failed in Jenkins: beam_PerformanceTests_AvroIOIT_HDFS #1248

See <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/1248/display/redirect?page=changes>

Changes:

[lukas.drbal] BEAM-6632: fix integer overflow for interval >= 25

[lukas.drbal] BEAM-6632: Use UTC timezone in test

[lukas.drbal] BEAM-6632: added todo comment related to CALCITE-2837

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam11 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 98d7f0db6cf5f1e35176dc06dc20ac9017c7f4f6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 98d7f0db6cf5f1e35176dc06dc20ac9017c7f4f6
Commit message: "Merge pull request #7785: [BEAM-6632] Fixes integer overflow for interval >= 25 DAY"
 > git rev-list --no-walk 7940259137e20d5eb35b4be142ff628e078fe6a1 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3195853978694327007.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format: "default"
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3177285779144801596.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248>
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3129017340854573014.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> create namespace beam-performancetests-avroioit-hdfs-1248
namespace "beam-performancetests-avroioit-hdfs-1248" created
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8181494535965047850.sh
++ kubectl config current-context
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> config set-context gke_apache-beam-testing_us-central1-a_io-datastores --namespace=beam-performancetests-avroioit-hdfs-1248
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins7944406480214942232.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker>
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4948441685939911889.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env>
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5923248170681673679.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env>
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env/bin/python2>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env/bin/python>
Installing setuptools, pkg_resources, pip, wheel...done.
Running virtualenv with interpreter /usr/bin/python2
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6730368312149736412.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (40.8.0)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.0.2)
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6934529132145762605.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6661836351692419734.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/7f/ff/ae64bacdfc95f27a016a7bed8e8686763ba4d277a78ca76f32659220a731/Jinja2-2.10-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (40.8.0)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/bc/3a/6bfd7b4b202fa33bdda8e4e3d3acc719f381fd730f9a0e7c5f34e845bd4d/MarkupSafe-1.1.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/42/a9/7e99652c6bc619d19d58cdd8c47560730eb5825d43a7e25db2e1d776ceb7/xmltodict-0.11.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/17/fd/4c2c8953a9dfe38fbe0c3adafb6355540bd98cef70cc82734acb0a4c0e2f/cryptography-2.5-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8e/5b/4047779fb456b0de503c4acb7b166becf2567efb772abb53998440791d3c/ntlm_auth-1.2.0-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/9f/e0/accfc1b56b57e9750eba272e24c4dddeac86852c2bebd1236674d7887e8a/certifi-2018.11.29-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting urllib3<1.25,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/62/00/ee1d7de624db8ba7090d1226aebefab96a2c71cd5cfa7629d6ad3f61b79e/urllib3-1.24.1-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/dd/3e7a1e1280e7d767bd3fa15791759c91ec19058ebe31217fe66f3e9a8c49/cffi-1.11.5-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, certifi, chardet, idna, urllib3, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.0 PyYAML-3.12 absl-py-0.7.0 asn1crypto-0.24.0 blinker-1.4 certifi-2018.11.29 cffi-1.11.5 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.5 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10 ntlm-auth-1.2.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.21.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.24.1 xmltodict-0.11.0
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2295400709067385931.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.avroioit_hdfs_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --python_binary=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/.beam_env/bin/python> --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src> --official=true --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> --benchmarks=beam_integration_benchmark --beam_it_timeout=1200 --beam_prebuilt=false --beam_sdk=java --beam_it_module=sdks/java/io/file-based-io-tests --beam_it_class=org.apache.beam.sdk.io.avro.AvroIOIT '--beam_it_options=[--project=apache-beam-testing,--tempRoot=gs://temp-storage-for-perf-tests,--numberOfRecords=1000000]' '--beam_extra_properties=[filesystem=hdfs]' --beam_options_config_file=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/pkb-config.yml> --beam_kubernetes_scripts=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
2019-02-10 18:06:51,506 0859cc81 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/runs/0859cc81/pkb.log>
2019-02-10 18:06:51,507 0859cc81 MainThread INFO     PerfKitBenchmarker version: v1.12.0-995-g14d99e2
2019-02-10 18:06:51,508 0859cc81 MainThread INFO     Flag values:
--beam_it_class=org.apache.beam.sdk.io.avro.AvroIOIT
--beam_it_timeout=1200
--beam_it_module=sdks/java/io/file-based-io-tests
--beam_sdk=java
--k8s_get_wait_interval=10
--python_binary=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/.beam_env/bin/python>
--bigquery_table=beam_performance.avroioit_hdfs_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/>
--beam_it_options=[--project=apache-beam-testing,--tempRoot=gs://temp-storage-for-perf-tests,--numberOfRecords=1000000]
--nobeam_prebuilt
--kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248>
--beam_extra_properties=[filesystem=hdfs]
--project=apache-beam-testing
--beam_kubernetes_scripts=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
--official
--dpb_log_level=INFO
--beam_options_config_file=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/pkb-config.yml>
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src>
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
2019-02-10 18:06:51,835 0859cc81 MainThread INFO     Setting --max_concurrent_threads=200.
2019-02-10 18:06:51,857 0859cc81 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-02-10 18:06:51,877 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-02-10 18:06:51,880 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-02-10 18:06:51,880 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/gradlew> clean assemble --stacktrace --info -p sdks/java/io/file-based-io-tests -DintegrationTestRunner=dataflow -Dfilesystem=hdfs
2019-02-10 18:06:51,989 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/gradlew> clean assemble --stacktrace --info -p sdks/java/io/file-based-io-tests -DintegrationTestRunner=dataflow -Dfilesystem=hdfs}  ReturnCode:1
STDOUT: Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread

STDERR: 
2019-02-10 18:06:51,990 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> create -f <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml>
2019-02-10 18:06:52,591 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-02-10 18:06:52,597 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> get svc hadoop -ojsonpath={.status.loadBalancer.ingress[0].ip}
2019-02-10 18:07:02,805 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> get svc hadoop -ojsonpath={.status.loadBalancer.ingress[0].ip}
2019-02-10 18:07:13,016 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> get svc hadoop -ojsonpath={.status.loadBalancer.ingress[0].ip}
2019-02-10 18:07:23,157 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> get svc hadoop -ojsonpath={.status.loadBalancer.ingress[0].ip}
2019-02-10 18:07:33,307 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> get svc hadoop -ojsonpath={.status.loadBalancer.ingress[0].ip}
2019-02-10 18:07:43,455 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> get svc hadoop -ojsonpath={.status.loadBalancer.ingress[0].ip}
2019-02-10 18:07:53,599 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> get svc hadoop -ojsonpath={.status.loadBalancer.ingress[0].ip}
2019-02-10 18:07:53,734 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Using LoadBalancer IP Address: 104.154.40.123
2019-02-10 18:07:53,734 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> get svc hadoop -ojsonpath={.status.loadBalancer.ingress[0].ip}
2019-02-10 18:07:53,879 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Using LoadBalancer IP Address: 104.154.40.123
2019-02-10 18:07:53,889 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/gradlew> integrationTest --tests=org.apache.beam.sdk.io.avro.AvroIOIT -p sdks/java/io/file-based-io-tests -DintegrationTestRunner=dataflow -Dfilesystem=hdfs -DintegrationTestPipelineOptions=["--project=apache-beam-testing","--tempRoot=gs://temp-storage-for-perf-tests","--numberOfRecords=1000000","--hdfsConfiguration=[{\"fs.defaultFS\":\"hdfs://104.154.40.123:9000\",\"dfs.replication\":1}]","--filenamePrefix=hdfs://104.154.40.123:9000/TEXTIO_IT_","--runner=TestDataflowRunner"] --stacktrace --info --scan
2019-02-10 18:07:53,987 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Ran: {<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/gradlew> integrationTest --tests=org.apache.beam.sdk.io.avro.AvroIOIT -p sdks/java/io/file-based-io-tests -DintegrationTestRunner=dataflow -Dfilesystem=hdfs -DintegrationTestPipelineOptions=["--project=apache-beam-testing","--tempRoot=gs://temp-storage-for-perf-tests","--numberOfRecords=1000000","--hdfsConfiguration=[{\"fs.defaultFS\":\"hdfs://104.154.40.123:9000\",\"dfs.replication\":1}]","--filenamePrefix=hdfs://104.154.40.123:9000/TEXTIO_IT_","--runner=TestDataflowRunner"] --stacktrace --info --scan}  ReturnCode:1
STDOUT: Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread

STDERR: 
2019-02-10 18:07:53,988 0859cc81 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 732, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 587, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-02-10 18:07:53,989 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-02-10 18:07:53,989 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1248> delete -f <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml> --ignore-not-found
2019-02-10 18:09:45,686 0859cc81 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 872, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 732, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 587, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-02-10 18:09:45,687 0859cc81 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-02-10 18:09:45,688 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-02-10 18:09:45,688 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/runs/0859cc81/pkb.log>
2019-02-10 18:09:45,688 0859cc81 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/runs/0859cc81/completion_statuses.json>
Build step 'Execute shell' marked build as failure
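
Note: the failure above is not a test regression. Both Gradle invocations died before any Beam code ran, with "Error occurred during initialization of VM / java.lang.OutOfMemoryError: unable to create new native thread", which usually means the Jenkins agent (beam11) hit its per-user process/thread limit or could not allocate memory for new thread stacks, rather than exhausting JVM heap. A minimal shell sketch for checking this on the agent is below; the "jenkins" account name is an assumption, not taken from this log:

    # per-user process/thread cap in effect for the current shell
    ulimit -u
    # total threads currently owned by the jenkins user (assumed account name)
    ps -o nlwp= -u jenkins | awk '{sum += $1} END {print sum}'
    # available memory on the agent, in MB
    free -m

If the thread count is close to the ulimit, leaked Gradle daemons or other stray processes on the executor are the usual culprit; build #1249 succeeding on the next run (see the follow-up below) is consistent with transient resource exhaustion on the agent.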

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_AvroIOIT_HDFS #1249

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/1249/display/redirect>

