Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/08/19 10:09:04 UTC

Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #443

See <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/443/display/redirect>

------------------------------------------
GitHub pull request #9315 of commit a634dfd1501b6fc33f99480627d48dc6ff66308f, no merge conflicts.
Setting status of a634dfd1501b6fc33f99480627d48dc6ff66308f to PENDING with url https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/443/ and message: 'Build started for merge commit.'
Using context: Java HadoopFormatIO Performance Test
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-11 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/>
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/9315/*:refs/remotes/origin/pr/9315/*
 > git rev-parse refs/remotes/origin/pr/9315/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/9315/merge^{commit} # timeout=10
Checking out Revision 6237e7e92e404ac222521c6a9f65d05edb5b4cc9 (refs/remotes/origin/pr/9315/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 6237e7e92e404ac222521c6a9f65d05edb5b4cc9
Commit message: "Merge a634dfd1501b6fc33f99480627d48dc6ff66308f into 3ce0674207c8bb4fc4cded31d4e5c48d5149318a"
First time build. Skipping changelog.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_HadoopFormat] $ /bin/bash -xe /tmp/jenkins3141287539649582347.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format: "default"
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_HadoopFormat] $ /bin/bash -xe /tmp/jenkins7842756926593799668.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/config-beam-performancetests-hadoopformat-443>
[beam_PerformanceTests_HadoopFormat] $ /bin/bash -xe /tmp/jenkins1051976389028926405.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/config-beam-performancetests-hadoopformat-443> create namespace beam-performancetests-hadoopformat-443
namespace/beam-performancetests-hadoopformat-443 created
[beam_PerformanceTests_HadoopFormat] $ /bin/bash -xe /tmp/jenkins7076775356761459442.sh
++ kubectl config current-context
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/config-beam-performancetests-hadoopformat-443> config set-context gke_apache-beam-testing_us-central1-a_io-datastores --namespace=beam-performancetests-hadoopformat-443
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_HadoopFormat] $ /bin/bash -xe /tmp/jenkins3137864772896893203.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker>
[beam_PerformanceTests_HadoopFormat] $ /bin/bash -xe /tmp/jenkins7969342558776284158.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/env/.perfkit_env>
[beam_PerformanceTests_HadoopFormat] $ /bin/bash -xe /tmp/jenkins4504498757800858149.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_HadoopFormat] $ /bin/bash -xe /tmp/jenkins5063617784811306761.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.1.0)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.2.2)
[beam_PerformanceTests_HadoopFormat] $ /bin/bash -xe /tmp/jenkins2806167951609165230.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_HadoopFormat] $ /bin/bash -xe /tmp/jenkins4085572696815917603.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 14))
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.1.0)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 18))
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/d8/a6/f46ae3f1da0cd4361c344888f59ec2f5785e69c872e175a748ef6071cdb5/futures-3.3.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 20))
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 23))
Collecting contextlib2>=0.5.1 (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 24))
  Using cached https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 26))
Collecting enum34; python_version < "3.4" (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting six (from absl-py->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/68/50698ce24c61db7d44d93a5043c621a0ca7839d4ef9dff913e6ab465fc92/cryptography-2.7-cp27-cp27mu-manylinux1_x86_64.whl
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/50/09/5e397eb18685b14fd8b209e26cdb4fa6451c82c1bcc651fef05fa73e7b27/ntlm_auth-1.4.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/e6/60/247f23a7121ae632d62811ba7f273d0e58972d75e58a94d329d51550a47d/urllib3-1.25.3-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/69/1b/b853c7a9d4f6a6d00749e94eb6f3a041e342a885b87340b79c1ef73e3a78/certifi-2019.6.16-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
Collecting asn1crypto>=0.21.0 (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
Collecting ipaddress; python_version < "3" (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
  Using cached https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/requirements.txt> (line 25))
Installing collected packages: enum34, six, absl-py, MarkupSafe, jinja2, colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 asn1crypto-0.24.0 blinker-1.4 certifi-2019.6.16 cffi-1.12.3 chardet-3.0.4 colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.7 enum34-1.1.6 funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.3.0 idna-2.8 ipaddress-1.0.22 jinja2-2.10.1 ntlm-auth-1.4.0 numpy-1.13.3 pint-0.9 pycparser-2.19 pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 timeout-decorator-0.4.1 urllib3-1.25.3 xmltodict-0.12.0
[beam_PerformanceTests_HadoopFormat] $ /bin/bash -xe /tmp/jenkins2395556018219206514.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.hadoopformatioit_pkb_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src> --official=true --dpb_service_zone=fake_zone --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/config-beam-performancetests-hadoopformat-443> --beam_it_timeout=1200 --benchmarks=beam_integration_benchmark --beam_prebuilt=false --beam_sdk=java --beam_it_module=:sdks:java:io:hadoop-format --beam_it_class=org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT '--beam_it_options=[--tempRoot=gs://temp-storage-for-perf-tests,--project=apache-beam-testing,--postgresPort=5432,--numberOfRecords=600000,--bigQueryDataset=beam_performance,--bigQueryTable=hadoopformatioit_results]' --beam_kubernetes_scripts=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml> --beam_options_config_file=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/pkb-config-local.yml>
2019-08-19 10:08:40,145 af2b9bae MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/runs/af2b9bae/pkb.log>
2019-08-19 10:08:40,146 af2b9bae MainThread INFO     PerfKitBenchmarker version: v1.12.0-1335-gf7f5bc4
2019-08-19 10:08:40,147 af2b9bae MainThread INFO     Flag values:
--beam_it_class=org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT
--beam_it_timeout=1200
--beam_it_module=:sdks:java:io:hadoop-format
--beam_sdk=java
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.hadoopformatioit_pkb_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/>
--beam_it_options=[--tempRoot=gs://temp-storage-for-perf-tests,--project=apache-beam-testing,--postgresPort=5432,--numberOfRecords=600000,--bigQueryDataset=beam_performance,--bigQueryTable=hadoopformatioit_results]
--nobeam_prebuilt
--kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/config-beam-performancetests-hadoopformat-443>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--beam_kubernetes_scripts=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml>
--official
--dpb_log_level=INFO
--beam_options_config_file=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/pkb-config-local.yml>
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src>
--k8s_get_retry_count=36
--benchmarks=beam_integration_benchmark
2019-08-19 10:08:40,387 af2b9bae MainThread INFO     Setting --max_concurrent_threads=200.
2019-08-19 10:08:40,410 af2b9bae MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2019-08-19 10:08:40,433 af2b9bae MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2019-08-19 10:08:40,436 af2b9bae MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2019-08-19 10:08:40,436 af2b9bae MainThread beam_integration_benchmark(1/1) INFO     Running: <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/gradlew> --stacktrace --info :sdks:java:io:hadoop-format:clean :sdks:java:io:hadoop-format:assemble -DintegrationTestRunner=dataflow
2019-08-19 10:09:02,747 af2b9bae MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/config-beam-performancetests-hadoopformat-443> create -f <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml>
2019-08-19 10:09:03,012 af2b9bae MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2019-08-19 10:09:03,013 af2b9bae MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 837, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 686, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 144, in Run
    beam_pipeline_options.ReadPipelineOptionConfigFile()
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 121, in ReadPipelineOptionConfigFile
    with open(FLAGS.beam_options_config_file, 'r') as fileStream:
IOError: [Errno 2] No such file or directory: '<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/pkb-config-local.yml>'
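[Editor's note: the traceback shows `ReadPipelineOptionConfigFile` passing `FLAGS.beam_options_config_file` straight to `open()`, so a missing or mis-resolved path surfaces as a bare `IOError`. The sketch below is a hypothetical defensive variant, not PerfKitBenchmarker's actual code; the function name and error message are illustrative only.]

```python
import os


def read_pipeline_option_config(path):
    """Return the raw pipeline-option config text from `path`.

    Hypothetical defensive variant: check the path before opening so a
    missing config file yields an actionable message (e.g. pointing at
    the workspace checkout) instead of a bare "[Errno 2]" IOError.
    """
    if not os.path.isfile(path):
        raise IOError(
            "beam_options_config_file not found: %r -- verify the file "
            "exists under the checked-out workspace 'src' directory" % path)
    # Path exists: read and return the file contents as-is.
    with open(path, "r") as stream:
        return stream.read()
```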
2019-08-19 10:09:03,014 af2b9bae MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-08-19 10:09:03,014 af2b9bae MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/config-beam-performancetests-hadoopformat-443> delete -f <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml> --ignore-not-found
2019-08-19 10:09:03,284 af2b9bae MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 980, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 837, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 686, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 144, in Run
    beam_pipeline_options.ReadPipelineOptionConfigFile()
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 121, in ReadPipelineOptionConfigFile
    with open(FLAGS.beam_options_config_file, 'r') as fileStream:
IOError: [Errno 2] No such file or directory: '<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/pkb-config-local.yml>'
2019-08-19 10:09:03,285 af2b9bae MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-08-19 10:09:03,285 af2b9bae MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-08-19 10:09:03,285 af2b9bae MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/runs/af2b9bae/pkb.log>
2019-08-19 10:09:03,285 af2b9bae MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/runs/af2b9bae/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_HadoopFormat #445

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/445/display/redirect>




Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #444

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/444/display/redirect>

------------------------------------------
[...truncated 226.76 KB...]
:sdks:java:harness:shadowJar (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 7.557 secs.
:runners:java-fn-execution:compileJava (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:java-fn-execution:compileJava FROM-CACHE
Build cache key for task ':runners:java-fn-execution:compileJava' is a880df7b0f04d5085b1df4d513cae1cf
Task ':runners:java-fn-execution:compileJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@2a84ded2: {executionTime=14988, hostName=apache-beam-jenkins-14, operatingSystem=Linux, buildInvocationId=kpyqa7sjhjfcppedbkhq6mktpm, creationTime=1565987593235, identity=:runners:java-fn-execution:compileJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Go_VR_Flink/src}
Unpacked trees for task ':runners:java-fn-execution:compileJava' from cache.
:runners:java-fn-execution:compileJava (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 2.348 secs.
:runners:java-fn-execution:classes (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:java-fn-execution:classes UP-TO-DATE
Skipping task ':runners:java-fn-execution:classes' as it has no actions.
:runners:java-fn-execution:classes (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:runners:java-fn-execution:jar (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:java-fn-execution:jar
Build cache key for task ':runners:java-fn-execution:jar' is 597e088740b6befe42ec8a0c9a7eb709
Caching disabled for task ':runners:java-fn-execution:jar': Caching has not been enabled for the task
Task ':runners:java-fn-execution:jar' is not up-to-date because:
  No history is available.
:runners:java-fn-execution:jar (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.039 secs.
:runners:direct-java:compileJava (Thread[Execution worker for ':' Thread 11,5,main]) started.
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Execution worker for ':' Thread 5,5,main]) started.

> Task :runners:direct-java:compileJava FROM-CACHE
Build cache key for task ':runners:direct-java:compileJava' is 2de433c0fdc60ffeddc25ffbc771f212
Task ':runners:direct-java:compileJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@6aada733: {executionTime=9398, hostName=apache-beam-jenkins-14, operatingSystem=Linux, buildInvocationId=fol4adxgt5e4hflvqitxuaipui, creationTime=1565987856024, identity=:runners:direct-java:compileJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Java_ValidatesRunner_Direct/src}
Unpacked trees for task ':runners:direct-java:compileJava' from cache.
:runners:direct-java:compileJava (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.068 secs.
:runners:direct-java:classes (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:direct-java:classes UP-TO-DATE
Skipping task ':runners:direct-java:classes' as it has no actions.
:runners:direct-java:classes (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:runners:direct-java:jar (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:direct-java:jar
Build cache key for task ':runners:direct-java:jar' is be7b81d1d1357e0c0c57c3bb6505856c
Caching disabled for task ':runners:direct-java:jar': Caching has not been enabled for the task
Task ':runners:direct-java:jar' is not up-to-date because:
  No history is available.
:runners:direct-java:jar (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.053 secs.
:runners:direct-java:shadowJar (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
file or directory '<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/src/main/java>', not found
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is d9706e1e4d7a52c37d4048e51bdda50d
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@4b244e0a: {executionTime=18601, hostName=apache-beam-jenkins-14, operatingSystem=Linux, buildInvocationId=4t5i2go255coveyq2oco5kx4xm, creationTime=1566000702957, identity=:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PreCommit_Java_Examples_Dataflow_Cron/src}
Unpacked trees for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava' from cache.
:runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 0.303 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 5,5,main]) started.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:worker:legacy-worker:classes' as it has no actions.
:runners:google-cloud-dataflow-java:worker:legacy-worker:classes (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 5,5,main]) started.

> Task :runners:direct-java:shadowJar
Build cache key for task ':runners:direct-java:shadowJar' is 5ebdc1267b85101c7f62f2dc30aa77b8
Caching disabled for task ':runners:direct-java:shadowJar': Caching has not been enabled for the task
Task ':runners:direct-java:shadowJar' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':runners:direct-java:shadowJar'.
*******************
GRADLE SHADOW STATS

Total Jars: 6 (includes project)
Total Time: 0.622s [622ms]
Average Time/Jar: 0.10366666666669999s [103.6666666667ms]
*******************
:runners:direct-java:shadowJar (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.854 secs.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is 2fe5fb6ed28bb4f31a5b7fa9ff5315c7
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@2b700ec7: {executionTime=12344, hostName=apache-beam-jenkins-14, operatingSystem=Linux, buildInvocationId=fjv7e26gojcfzblsljt3p6hfxq, creationTime=1566000511504, identity=:sdks:java:io:google-cloud-platform:compileTestJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PostCommit_SQL/src}
Unpacked trees for task ':sdks:java:io:google-cloud-platform:compileTestJava' from cache.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.25 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses UP-TO-DATE
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Build cache key for task ':sdks:java:io:google-cloud-platform:testJar' is b4f734c6a489e06c91560dca093d7f40
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar': Caching has not been enabled for the task
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.049 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is 290c7f119e5ffb3339b634121adf5a0e
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
Origin for org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution@6c7051d9: {executionTime=8106, hostName=apache-beam-jenkins-14, operatingSystem=Linux, buildInvocationId=6g7tszywojdgfclmz2ejdogvz4, creationTime=1566002886040, identity=:runners:google-cloud-dataflow-java:compileTestJava, type=org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.TaskExecution, userName=jenkins, gradleVersion=5.2.1, rootPath=/home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_XmlIOIT_HDFS/src}
Unpacked trees for task ':runners:google-cloud-dataflow-java:compileTestJava' from cache.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.146 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 11,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Build cache key for task ':runners:google-cloud-dataflow-java:testJar' is 2b3d4a2678d51e113fb24f4e3107fea8
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution worker for ':' Thread 11,5,main]) completed. Took 0.025 secs.

> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar
Build cache key for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is a73b560d658ac15979b949682b9d2a17
Caching disabled for task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar': Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar' is not up-to-date because:
  No history is available.
Custom actions are attached to task ':runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar'.
*******************
GRADLE SHADOW STATS

Total Jars: 16 (includes project)
Total Time: 3.401s [3401ms]
Average Time/Jar: 0.2125625s [212.5625ms]
*******************
:runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 4.335 secs.
:sdks:java:io:hadoop-format:compileTestJava (Thread[Execution worker for ':' Thread 5,5,main]) started.

> Task :sdks:java:io:hadoop-format:compileTestJava
Build cache key for task ':sdks:java:io:hadoop-format:compileTestJava' is a4ba9fafd5b0f0303309c76ed421f7a4
Task ':sdks:java:io:hadoop-format:compileTestJava' is not up-to-date because:
  No history is available.
All input files are considered out-of-date for incremental task ':sdks:java:io:hadoop-format:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with error-prone compiler
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Created classpath snapshot for incremental compilation in 1.881 secs. 4652 duplicate classes found in classpath (see all with --debug).
Packing task ':sdks:java:io:hadoop-format:compileTestJava'
:sdks:java:io:hadoop-format:compileTestJava (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 9.53 secs.
:sdks:java:io:hadoop-format:testClasses (Thread[Execution worker for ':' Thread 5,5,main]) started.

> Task :sdks:java:io:hadoop-format:testClasses
Skipping task ':sdks:java:io:hadoop-format:testClasses' as it has no actions.
:sdks:java:io:hadoop-format:testClasses (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 0.0 secs.
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution worker for ':' Thread 6,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest
Build cache key for task ':sdks:java:io:hadoop-format:integrationTest' is bf38bd7083169b6c14f6e54dc44358c1
Task ':sdks:java:io:hadoop-format:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Custom actions are attached to task ':sdks:java:io:hadoop-format:integrationTest'.
Starting process 'Gradle Test Executor 1'. Working directory: <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--numberOfRecords=600000","--bigQueryDataset=beam_performance","--bigQueryTable=hadoopformatioit_results","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresServerName=104.198.223.85","--postgresSsl=false","--postgresPort=5432","--workerHarnessContainerImage=","--dataflowWorkerJar=${dataflowWorkerJar}"] -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/5.2.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.16.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.25/bccda40ebc8067491b32a88f49615a747d20082d/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-classic/1.1.3/d90276fff414f06cb375f2057f6778cd63c6082f/logback-classic-1.1.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-log4j12/1.7.25/110cefe2df103412849d72ef7a67e4e91e4266b4/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
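
[Editor's note: the SLF4J warnings above report four StaticLoggerBinder implementations on the test classpath (the Dataflow legacy-worker jar, slf4j-jdk14, logback-classic, and slf4j-log4j12), so SLF4J picks one arbitrarily. A minimal Gradle sketch of one way to leave a single binding, assuming the slf4j-jdk14 binding is the intended one and that testRuntimeClasspath is the relevant configuration in this module's build.gradle:]

```groovy
// Sketch only: drop the extra SLF4J bindings named in the log so one
// StaticLoggerBinder (slf4j-jdk14 here) remains on the test classpath.
// The configuration name and the binding kept are assumptions.
configurations.testRuntimeClasspath {
    exclude group: 'ch.qos.logback', module: 'logback-classic'
    exclude group: 'org.slf4j', module: 'slf4j-log4j12'
}
```

[This would not remove the binding shaded inside the legacy-worker jar; that one would need to be excluded when the shadow jar is built.]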

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat STANDARD_ERROR
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.util.NativeCodeLoader <clinit>
    WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter <init>
    INFO: File Output Committer Algorithm version is 1
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter <init>
    INFO: File Output Committer Algorithm version is 1
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter <init>
    INFO: File Output Committer Algorithm version is 1
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter setupJob
    WARNING: Output Path is null in setupJob()
    Aug 19, 2019 10:39:39 AM org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$SetupJobFn trySetupJob
    INFO: Job with id 1 successfully configured from window with max timestamp 294247-01-09T04:00:54.775Z.
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter <init>
    INFO: File Output Committer Algorithm version is 1
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter <init>
    INFO: File Output Committer Algorithm version is 1
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter <init>
    INFO: File Output Committer Algorithm version is 1
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter <init>
    INFO: File Output Committer Algorithm version is 1
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter <init>
    INFO: File Output Committer Algorithm version is 1
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter setupJob
    WARNING: Output Path is null in setupJob()
    Aug 19, 2019 10:39:39 AM org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$SetupJobFn trySetupJob
    INFO: Job with id 1 successfully configured from window with max timestamp 294247-01-09T04:00:54.775Z.
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter <init>
    INFO: File Output Committer Algorithm version is 1
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter setupJob
    WARNING: Output Path is null in setupJob()
    Aug 19, 2019 10:39:39 AM org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$SetupJobFn trySetupJob
    INFO: Job with id 1 successfully configured from window with max timestamp 294247-01-09T04:00:54.775Z.
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter <init>
    INFO: File Output Committer Algorithm version is 1
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter <init>
    INFO: File Output Committer Algorithm version is 1
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter setupJob
    WARNING: Output Path is null in setupJob()
    Aug 19, 2019 10:39:39 AM org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$SetupJobFn trySetupJob
    INFO: Job with id 1 successfully configured from window with max timestamp 294247-01-09T04:00:54.775Z.
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter setupJob
    WARNING: Output Path is null in setupJob()
    Aug 19, 2019 10:39:39 AM org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$SetupJobFn trySetupJob
    INFO: Job with id 1 successfully configured from window with max timestamp 294247-01-09T04:00:54.775Z.
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter setupJob
    WARNING: Output Path is null in setupJob()
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter setupJob
    WARNING: Output Path is null in setupJob()
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter setupJob
    WARNING: Output Path is null in setupJob()
    Aug 19, 2019 10:39:39 AM org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$SetupJobFn trySetupJob
    INFO: Job with id 1 successfully configured from window with max timestamp 294247-01-09T04:00:54.775Z.
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter setupJob
    WARNING: Output Path is null in setupJob()
    Aug 19, 2019 10:39:39 AM org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$SetupJobFn trySetupJob
    INFO: Job with id 1 successfully configured from window with max timestamp 294247-01-09T04:00:54.775Z.
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter setupJob
    WARNING: Output Path is null in setupJob()
    Aug 19, 2019 10:39:39 AM org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$SetupJobFn trySetupJob
    INFO: Job with id 1 successfully configured from window with max timestamp 294247-01-09T04:00:54.775Z.
    Aug 19, 2019 10:39:39 AM org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$SetupJobFn trySetupJob
    INFO: Job with id 1 successfully configured from window with max timestamp 294247-01-09T04:00:54.775Z.
    Aug 19, 2019 10:39:39 AM org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$SetupJobFn trySetupJob
    INFO: Job with id 1 successfully configured from window with max timestamp 294247-01-09T04:00:54.775Z.
    Aug 19, 2019 10:39:39 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter setupJob
    WARNING: Output Path is null in setupJob()
    Aug 19, 2019 10:39:39 AM org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$SetupJobFn trySetupJob
    INFO: Job with id 1 successfully configured from window with max timestamp 294247-01-09T04:00:54.775Z.
Process leaked file descriptors. See https://jenkins.io/redirect/troubleshooting/process-leaked-file-descriptors for more information
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat SKIPPED
