Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/01/18 13:13:00 UTC

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #1

See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/1/display/redirect>

------------------------------------------
GitHub pull request #4401 of commit eeaf07975e27bdc829c7a2215f6d26f136abcbb9, no merge conflicts.
Setting status of eeaf07975e27bdc829c7a2215f6d26f136abcbb9 to PENDING with url https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/1/ and message: 'Build started sha1 is merged.'
Using context: Jenkins: Java TextIO Performance Test on HDFS
[EnvInject] - Loading node environment variables.
Building remotely on beam1 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/>
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/4401/*:refs/remotes/origin/pr/4401/*
 > git rev-parse refs/remotes/origin/pr/4401/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/4401/merge^{commit} # timeout=10
Checking out Revision e27d992229189b00afd0f931e9cfb8cd5d6085ea (refs/remotes/origin/pr/4401/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e27d992229189b00afd0f931e9cfb8cd5d6085ea
Commit message: "Merge eeaf07975e27bdc829c7a2215f6d26f136abcbb9 into 2f235dd58acce27f713c7072d62cd3a72e3413a1"
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8948178766946544449.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins1532808344824072923.sh
+ rm -rf .env
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4757675502495328233.sh
+ virtualenv .env --system-site-packages
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.env/bin/python>
Installing setuptools, pip, wheel...done.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins1677499170345553716.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3792081257058949260.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in /usr/local/lib/python2.7/dist-packages (from -r PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 21))
Requirement already satisfied: numpy in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 22))
Requirement already satisfied: functools32 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: six in /home/jenkins/.local/lib/python2.7/site-packages (from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: MarkupSafe in /usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: colorama; extra == "windows" in /usr/lib/python2.7/dist-packages (from colorlog[windows]==2.6.0->-r PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: xmltodict in /home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: requests-ntlm>=0.3.0 in /home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: requests>=2.9.1 in /home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: ntlm-auth>=1.0.2 in /home/jenkins/.local/lib/python2.7/site-packages (from requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: cryptography>=1.3 in /home/jenkins/.local/lib/python2.7/site-packages (from requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: idna<2.6,>=2.5 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: urllib3<1.22,>=1.21.1 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: certifi>=2017.4.17 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: cffi>=1.7; platform_python_implementation != "PyPy" in /home/jenkins/.local/lib/python2.7/site-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: enum34; python_version < "3" in /home/jenkins/.local/lib/python2.7/site-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: asn1crypto>=0.21.0 in /home/jenkins/.local/lib/python2.7/site-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: ipaddress; python_version < "3" in /usr/local/lib/python2.7/dist-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: pycparser in /home/jenkins/.local/lib/python2.7/site-packages (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins1865062812066033055.sh
+ .env/bin/pip install -e 'src/sdks/python/[gcp,test]'
Obtaining file://<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/sdks/python>
Requirement already satisfied: avro<2.0.0,>=1.8.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: crcmod<2.0,>=1.7 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: dill==0.2.6 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: grpcio<2,>=1.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: httplib2<0.10,>=0.8 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: oauth2client<5,>=2.0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: pyyaml<4.0.0,>=3.12 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: six<1.12,>=1.9 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: typing<3.7.0,>=3.6.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: futures<4.0.0,>=3.1.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: hdfs3<0.4.0,>=0.3.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: google-apitools<=0.5.20,>=0.5.18 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: googledatastore==7.0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: google-cloud-pubsub==0.26.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: google-cloud-bigquery==0.25.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: enum34>=1.0.4 in /home/jenkins/.local/lib/python2.7/site-packages (from grpcio<2,>=1.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: funcsigs>=1; python_version < "3.3" in /home/jenkins/.local/lib/python2.7/site-packages (from mock<3.0.0,>=1.0.1->apache-beam==2.3.0.dev0)
Requirement already satisfied: pbr>=0.11 in /home/jenkins/.local/lib/python2.7/site-packages (from mock<3.0.0,>=1.0.1->apache-beam==2.3.0.dev0)
Requirement already satisfied: pyasn1>=0.1.7 in /home/jenkins/.local/lib/python2.7/site-packages (from oauth2client<5,>=2.0.1->apache-beam==2.3.0.dev0)
Requirement already satisfied: pyasn1-modules>=0.0.5 in /home/jenkins/.local/lib/python2.7/site-packages (from oauth2client<5,>=2.0.1->apache-beam==2.3.0.dev0)
Requirement already satisfied: rsa>=3.1.4 in /home/jenkins/.local/lib/python2.7/site-packages (from oauth2client<5,>=2.0.1->apache-beam==2.3.0.dev0)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages (from protobuf<4,>=3.5.0.post1->apache-beam==2.3.0.dev0)
Requirement already satisfied: fasteners>=0.14 in /home/jenkins/.local/lib/python2.7/site-packages (from google-apitools<=0.5.20,>=0.5.18->apache-beam==2.3.0.dev0)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in /home/jenkins/.local/lib/python2.7/site-packages (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: google-cloud-core<0.26dev,>=0.25.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: monotonic>=0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from fasteners>=0.14->google-apitools<=0.5.20,>=0.5.18->apache-beam==2.3.0.dev0)
Requirement already satisfied: google-auth<2.0.0dev,>=0.4.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: google-auth-httplib2 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: google-gax<0.16dev,>=0.15.7 in /home/jenkins/.local/lib/python2.7/site-packages (from gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: proto-google-cloud-pubsub-v1[grpc]<0.16dev,>=0.15.4 in /home/jenkins/.local/lib/python2.7/site-packages (from gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: grpc-google-iam-v1<0.12dev,>=0.11.1 in /home/jenkins/.local/lib/python2.7/site-packages (from gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: cachetools>=2.0.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-auth<2.0.0dev,>=0.4.0->google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: future<0.17dev,>=0.16.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: ply==3.8 in /home/jenkins/.local/lib/python2.7/site-packages (from google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: requests<3.0dev,>=2.13.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: idna<2.6,>=2.5 in /home/jenkins/.local/lib/python2.7/site-packages (from requests<3.0dev,>=2.13.0->google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: urllib3<1.22,>=1.21.1 in /home/jenkins/.local/lib/python2.7/site-packages (from requests<3.0dev,>=2.13.0->google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /home/jenkins/.local/lib/python2.7/site-packages (from requests<3.0dev,>=2.13.0->google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: certifi>=2017.4.17 in /home/jenkins/.local/lib/python2.7/site-packages (from requests<3.0dev,>=2.13.0->google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.3.0.dev0
    Not uninstalling apache-beam at /home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_TextIOIT/src/sdks/python, outside environment <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.env>
  Running setup.py develop for apache-beam
Successfully installed apache-beam
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins660171517124400448.sh
+ .env/bin/python PerfKitBenchmarker/pkb.py --project=apache-beam-testing --dpb_log_level=INFO --maven_binary=/home/jenkins/tools/maven/latest/bin/mvn --bigquery_table=beam_performance.textioit_hdfs_pkb_results --official=true --benchmarks=beam_integration_benchmark --beam_it_timeout=1200 --beam_it_profile=io-it --beam_prebuilt=true --beam_sdk=java --beam_it_module=sdks/java/io/file-based-io-tests --beam_it_class=org.apache.beam.sdk.io.text.TextIOIT '--beam_it_options=[--project=apache-beam-testing,--tempRoot=gs://temp-storage-for-perf-tests,--numberOfRecords=1000000]' '--beam_extra_mvn_properties=[filesystem=hdfs]' --beam_options_config_file=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/pkb-config.yml> --beam_kubernetes_scripts=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml,/home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_TextIOIT_HDFS/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
Traceback (most recent call last):
  File "PerfKitBenchmarker/pkb.py", line 21, in <module>
    sys.exit(Main())
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 869, in Main
    SetUpPKB()
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 740, in SetUpPKB
    vm_util.GenTempDir()
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 121, in GenTempDir
    temp_dir.CreateTemporaryDirectories()
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/temp_dir.py",> line 62, in CreateTemporaryDirectories
    os.makedirs(path)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.env/lib/python2.7/os.py",> line 157, in makedirs
    mkdir(name, mode)
OSError: [Errno 13] Permission denied: '/tmp/perfkitbenchmarker/runs/5c5ef9c9'
Build step 'Execute shell' marked build as failure

Jenkins build is back to normal : beam_PerformanceTests_TextIOIT_HDFS #8

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/8/display/redirect>


Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #7

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/7/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam5 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5b6ca47fec0b5b720ad5afb9274dc0d418545b43 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5b6ca47fec0b5b720ad5afb9274dc0d418545b43
Commit message: "[BEAM-2281][Sql] Use SqlFunctions.toBigDecimal not toString (#4865)"
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3556736611502762214.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5780875209998727672.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521114081585>
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3234481594524219174.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521114081585> create namespace filebasedioithdfs-1521114081585
Error from server (AlreadyExists): namespaces "filebasedioithdfs-1521114081585" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #6

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/6/display/redirect>

------------------------------------------
[...truncated 54.10 KB...]
[INFO] Excluding org.threeten:threetenbp:jar:1.3.3 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core-grpc:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java-util:jar:3.2.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-pubsub:jar:v1-rev10-1.22.0 from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-pubsub-v1:jar:0.1.18 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-pubsub-v1:jar:0.1.18 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-iam-v1:jar:0.1.18 from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-proto-client:jar:1.4.0 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-protobuf:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.cloud.datastore:datastore-v1-protos:jar:1.3.0 from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-common-protos:jar:0.1.9 from the shaded jar.
[INFO] Excluding io.grpc:grpc-auth:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-netty:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http2:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-http:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler-proxy:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec-socks:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-resolver:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.1.8.Final from the shaded jar.
[INFO] Excluding io.grpc:grpc-stub:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-all:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-okhttp:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.squareup.okhttp:okhttp:jar:2.5.0 from the shaded jar.
[INFO] Excluding com.squareup.okio:okio:jar:1.6.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-lite:jar:1.2.0 from the shaded jar.
[INFO] Excluding io.grpc:grpc-protobuf-nano:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.protobuf.nano:protobuf-javanano:jar:3.0.0-alpha-5 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core:jar:1.0.2 from the shaded jar.
[INFO] Excluding org.json:json:jar:20160810 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-spanner:jar:0.20.0b-beta from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-v1:jar:0.1.11b from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-v1:jar:0.1.11b from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-admin-database-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-longrunning-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengine:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.opencensus:opencensus-contrib-grpc-util:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.opencensus:opencensus-api:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.dropwizard.metrics:metrics-core:jar:3.1.2 from the shaded jar.
[INFO] Excluding io.netty:netty-tcnative-boringssl-static:jar:1.1.33.Fork26 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-admin-database-v1:jar:0.1.9 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-common-protos:jar:0.1.9 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client:jar:1.22.0 from the shaded jar.
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.0.1 from the shaded jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.0.1 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 from the shaded jar.
[INFO] Excluding org.apache.beam:beam-sdks-java-io-hadoop-file-system:jar:2.5.0-SNAPSHOT from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-hdfs:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty:jar:6.1.26 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty-util:jar:6.1.26 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-core:jar:1.9 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-server:jar:1.9 from the shaded jar.
[INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
[INFO] Excluding commons-cli:commons-cli:jar:1.2 from the shaded jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.4 from the shaded jar.
[INFO] Excluding commons-io:commons-io:jar:2.4 from the shaded jar.
[INFO] Excluding commons-lang:commons-lang:jar:2.6 from the shaded jar.
[INFO] Excluding commons-logging:commons-logging:jar:1.1.3 from the shaded jar.
[INFO] Excluding commons-daemon:commons-daemon:jar:1.0.13 from the shaded jar.
[INFO] Excluding log4j:log4j:jar:1.2.17 from the shaded jar.
[INFO] Excluding com.google.protobuf:protobuf-java:jar:3.2.0 from the shaded jar.
[INFO] Excluding javax.servlet:servlet-api:jar:2.5 from the shaded jar.
[INFO] Excluding xmlenc:xmlenc:jar:0.52 from the shaded jar.
[INFO] Excluding io.netty:netty-all:jar:4.0.23.Final from the shaded jar.
[INFO] Excluding xerces:xercesImpl:jar:2.9.1 from the shaded jar.
[INFO] Excluding xml-apis:xml-apis:jar:1.3.04 from the shaded jar.
[INFO] Excluding org.apache.htrace:htrace-core:jar:3.1.0-incubating from the shaded jar.
[INFO] Excluding org.fusesource.leveldbjni:leveldbjni-all:jar:1.8 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-client:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-common:jar:2.7.3 from the shaded jar.
[INFO] Excluding org.apache.commons:commons-math3:jar:3.1.1 from the shaded jar.
[INFO] Excluding commons-httpclient:commons-httpclient:jar:3.1 from the shaded jar.
[INFO] Excluding commons-net:commons-net:jar:3.1 from the shaded jar.
[INFO] Excluding commons-collections:commons-collections:jar:3.2.2 from the shaded jar.
[INFO] Excluding javax.servlet.jsp:jsp-api:jar:2.1 from the shaded jar.
[INFO] Excluding commons-configuration:commons-configuration:jar:1.6 from the shaded jar.
[INFO] Excluding commons-digester:commons-digester:jar:1.8 from the shaded jar.
[INFO] Excluding commons-beanutils:commons-beanutils:jar:1.7.0 from the shaded jar.
[INFO] Excluding commons-beanutils:commons-beanutils-core:jar:1.8.0 from the shaded jar.
[INFO] Excluding org.slf4j:slf4j-log4j12:jar:1.7.10 from the shaded jar.
[INFO] Excluding com.google.code.gson:gson:jar:2.2.4 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-auth:jar:2.7.3 from the shaded jar.
[INFO] Excluding org.apache.directory.server:apacheds-kerberos-codec:jar:2.0.0-M15 from the shaded jar.
[INFO] Excluding org.apache.directory.server:apacheds-i18n:jar:2.0.0-M15 from the shaded jar.
[INFO] Excluding org.apache.directory.api:api-asn1-api:jar:1.0.0-M20 from the shaded jar.
[INFO] Excluding org.apache.directory.api:api-util:jar:1.0.0-M20 from the shaded jar.
[INFO] Excluding org.apache.curator:curator-framework:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.curator:curator-client:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.curator:curator-recipes:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.zookeeper:zookeeper:jar:3.4.6 from the shaded jar.
[INFO] Excluding io.netty:netty:jar:3.7.0.Final from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-yarn-client:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-yarn-server-common:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-mapreduce-client-shuffle:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-yarn-api:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-mapreduce-client-core:jar:2.7.3 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-yarn-common:jar:2.7.3 from the shaded jar.
[INFO] Excluding javax.xml.bind:jaxb-api:jar:2.2.2 from the shaded jar.
[INFO] Excluding javax.xml.stream:stax-api:jar:1.0-2 from the shaded jar.
[INFO] Excluding javax.activation:activation:jar:1.1 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-client:jar:1.9 from the shaded jar.
[INFO] Excluding org.codehaus.jackson:jackson-jaxrs:jar:1.9.13 from the shaded jar.
[INFO] Excluding org.codehaus.jackson:jackson-xc:jar:1.9.13 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-mapreduce-client-jobclient:jar:2.7.1 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-annotations:jar:2.7.1 from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/d55f2a78/beam/sdks/java/io/file-based-io-tests/target/beam-sdks-java-io-file-based-io-tests-2.5.0-SNAPSHOT.jar> with <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/d55f2a78/beam/sdks/java/io/file-based-io-tests/target/beam-sdks-java-io-file-based-io-tests-2.5.0-SNAPSHOT-shaded.jar>
[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/d55f2a78/beam/sdks/java/io/file-based-io-tests/target/beam-sdks-java-io-file-based-io-tests-2.5.0-SNAPSHOT-tests.jar> with <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/d55f2a78/beam/sdks/java/io/file-based-io-tests/target/beam-sdks-java-io-file-based-io-tests-2.5.0-SNAPSHOT-shaded-tests.jar>
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20.1:integration-test (default) @ beam-sdks-java-io-file-based-io-tests ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/d55f2a78/beam/sdks/java/io/file-based-io-tests/target/failsafe-reports>
[INFO] parallel='all', perCoreThreadCount=true, threadCount=4, useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, threadCountMethods=0, parallelOptimized=true
[INFO] 
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.beam.sdk.io.text.TextIOIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 287.798 s - in org.apache.beam.sdk.io.text.TextIOIT
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ beam-sdks-java-io-file-based-io-tests ---
[WARNING] Used undeclared dependencies found:
[WARNING]    javax.xml.bind:jaxb-api:jar:2.2.2:runtime
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:11 min
[INFO] Finished at: 2018-03-15T11:50:15Z
[INFO] Final Memory: 98M/1218M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-dependency-plugin:3.0.2:analyze-only (default) on project beam-sdks-java-io-file-based-io-tests: Dependency problems found -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-dependency-plugin:3.0.2:analyze-only (default) on project beam-sdks-java-io-file-based-io-tests: Dependency problems found
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: Dependency problems found
    at org.apache.maven.plugins.dependency.analyze.AbstractAnalyzeMojo.execute (AbstractAnalyzeMojo.java:254)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

STDERR: 
2018-03-15 11:50:15,735 d55f2a78 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-03-15 11:50:15,736 d55f2a78 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-03-15 11:50:15,737 d55f2a78 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521114081585> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-15 11:50:16,350 d55f2a78 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521114081585> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
2018-03-15 11:50:16,526 d55f2a78 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 733, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-03-15 11:50:16,526 d55f2a78 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-03-15 11:50:16,527 d55f2a78 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-03-15 11:50:16,527 d55f2a78 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/d55f2a78/pkb.log>
2018-03-15 11:50:16,527 d55f2a78 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/d55f2a78/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #5

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/5/display/redirect>

------------------------------------------
[...truncated 215.46 KB...]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy61.create(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:248)
	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:235)
	at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn$DoFnInvoker.invokeProcessElement(Unknown Source)
	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:138)
	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
	at com.google.cloud.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:118)
	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
	at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
	at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
	at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
	at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeterministicallyConstructTestTextLineFn.processElement(FileBasedIOITHelper.java:84)
	at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeterministicallyConstructTestTextLineFn$DoFnInvoker.invokeProcessElement(Unknown Source)
	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:189)
	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:150)
	at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:74)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:381)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:353)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
(8781d0eb0c49bf6c): org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create file/.temp-beam-2018-03-14_16-44-48-0/91c84733-c312-4d0e-ae25-5bd38ee5ad9b. Name node is in safe mode.
The reported blocks 31 has reached the threshold 0.9990 of total blocks 31. The number of live datanodes 1 has reached the minimum number 0. In safe mode extension. Safe mode will be turned off automatically in 28 seconds.
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1327)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2447)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2335)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:623)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:397)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

	at org.apache.hadoop.ipc.Client.call(Client.java:1475)
	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
	at com.sun.proxy.$Proxy60.create(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy61.create(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1623)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1703)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1638)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:248)
	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:235)
	at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
(5cc075156366976b): Workflow failed. Causes: (5cc0751563669a40): S02:Generate sequence/Read(BoundedCountingSource)+Produce text lines+Write content to files/WriteFiles/RewindowIntoGlobal/Window.Assign+Write content to files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write content to files/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+Write content to files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write content to files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write failed., (cea43ae8f4de6e2d): A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  textioit0writethenreadall-03140944-43be-harness-9cm6,
  textioit0writethenreadall-03140944-43be-harness-9cm6,
  textioit0writethenreadall-03140944-43be-harness-9cm6,
  textioit0writethenreadall-03140944-43be-harness-9cm6
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
	at org.apache.beam.sdk.io.text.TextIOIT.writeThenReadAll(TextIOIT.java:114)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
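The worker-side frames above show the write path from Beam's FileBasedSink through HadoopFileSystem.create into HDFS's DistributedFileSystem.create, and the surefire summary below reports the run failing with java.net.ConnectEx... (truncated in this log), which suggests the Dataflow workers never reached the HDFS namenode exposed by the Kubernetes cluster. A minimal reachability sketch in Python is shown here; the namenode host and port are placeholders and do not come from this log.

    import socket

    # Placeholder endpoint -- the real namenode address comes from the
    # hdfsConfiguration supplied to the test, not from this log.
    NAMENODE_HOST = "hdfs-namenode.example.internal"
    NAMENODE_PORT = 9000  # common HDFS RPC port; an assumption here

    def namenode_reachable(host, port, timeout_s=5.0):
        """Return True if a TCP connection to the namenode RPC port succeeds."""
        sock = None
        try:
            sock = socket.create_connection((host, port), timeout=timeout_s)
            return True
        except (socket.error, socket.timeout) as exc:
            print("namenode not reachable: %s" % exc)
            return False
        finally:
            if sock is not None:
                sock.close()

    if __name__ == "__main__":
        print(namenode_reachable(NAMENODE_HOST, NAMENODE_PORT))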

[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR]   TextIOIT.writeThenReadAll:114  Runtime (8781d0eb0c49b597): java.net.ConnectEx...
[INFO] 
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ beam-sdks-java-io-file-based-io-tests ---
[WARNING] Used undeclared dependencies found:
[WARNING]    javax.xml.bind:jaxb-api:jar:2.2.2:runtime
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:12 min
[INFO] Finished at: 2018-03-14T16:49:37Z
[INFO] Final Memory: 102M/1337M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-dependency-plugin:3.0.2:analyze-only (default) on project beam-sdks-java-io-file-based-io-tests: Dependency problems found -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-dependency-plugin:3.0.2:analyze-only (default) on project beam-sdks-java-io-file-based-io-tests: Dependency problems found
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: Dependency problems found
    at org.apache.maven.plugins.dependency.analyze.AbstractAnalyzeMojo.execute (AbstractAnalyzeMojo.java:254)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException

STDERR: 
2018-03-14 16:49:38,221 6837e19e MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-03-14 16:49:38,223 6837e19e MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-03-14 16:49:38,223 6837e19e MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521044467740> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 16:49:39,051 6837e19e MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521044467740> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
2018-03-14 16:49:39,234 6837e19e MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 733, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-03-14 16:49:39,234 6837e19e MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-03-14 16:49:39,234 6837e19e MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-03-14 16:49:39,235 6837e19e MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/6837e19e/pkb.log>
2018-03-14 16:49:39,235 6837e19e MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/6837e19e/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #4

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/4/display/redirect>

------------------------------------------
[...truncated 113.93 KB...]

2018-03-14 14:29:11,220 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:29:26,707 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:29:26,809 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:29:26,810 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.
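This cleanup loop repeats the same cycle for more than ten minutes (from about 14:29 to 14:40): kubectl exits with code 1 because the manifest path under the workspace's .test-infra/ directory does not exist (the cleanup commands earlier in this digest pointed at ws/src/.test-infra/..., while these point at ws/.test-infra/... without the src/ component, which may be why the file is not found), yet IssueRetryableCommand treats every non-zero exit as retryable and tries again. Below is a rough sketch of a retry wrapper that gives up early on clearly permanent errors; it is an illustration only, not PerfKit Benchmarker's actual vm_util code.

    import subprocess
    import time

    # Illustrative retry wrapper in the spirit of vm_util.IssueRetryableCommand;
    # this is NOT PerfKit Benchmarker's actual implementation.
    PERMANENT_MARKERS = ("does not exist", "no such file or directory")

    def run_with_retries(cmd, max_wait_s=600, delay_s=15):
        """Re-run cmd until it exits 0, but stop early on clearly permanent errors."""
        deadline = time.time() + max_wait_s
        while True:
            proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
            out, err = proc.communicate()
            err_text = err.decode("utf-8", "replace") if isinstance(err, bytes) else err
            if proc.returncode == 0:
                return out
            if any(marker in err_text.lower() for marker in PERMANENT_MARKERS):
                raise RuntimeError("permanent failure, not retrying: " + err_text.strip())
            if time.time() >= deadline:
                raise RuntimeError("retries exhausted: " + err_text.strip())
            time.sleep(delay_s)

    # Hypothetical usage; the paths below are placeholders:
    # run_with_retries(["kubectl", "--kubeconfig=/path/to/kubeconfig",
    #                   "delete", "-f", "/path/to/hdfs-single-datanode-cluster.yml"])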

2018-03-14 14:29:49,521 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:29:49,622 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:29:49,623 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:30:13,933 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:30:14,043 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:30:14,043 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:30:43,071 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:30:43,169 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:30:43,169 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:31:01,951 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:31:02,065 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:31:02,066 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:31:22,275 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:31:22,373 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:31:22,373 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:31:42,837 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:31:42,946 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:31:42,946 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:32:00,904 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:32:01,023 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:32:01,023 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:32:21,366 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:32:21,471 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:32:21,471 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:32:49,432 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:32:49,549 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:32:49,549 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:33:13,306 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:33:13,416 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:33:13,416 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:33:41,633 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:33:41,742 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:33:41,743 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:33:59,219 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:33:59,333 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:33:59,333 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:34:15,455 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:34:15,559 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:34:15,559 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:34:42,357 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:34:42,471 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:34:42,472 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:35:10,047 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:35:10,150 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:35:10,151 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:35:27,452 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:35:27,559 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:35:27,559 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:35:52,226 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:35:52,343 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:35:52,343 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:36:21,582 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:36:21,687 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:36:21,687 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:36:40,215 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:36:40,337 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:36:40,337 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:37:05,396 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:37:05,503 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:37:05,503 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:37:25,715 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:37:25,822 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:37:25,823 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:37:53,911 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:37:54,023 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:37:54,023 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:38:10,748 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:38:10,913 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:38:10,914 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:38:38,866 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:38:38,973 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:38:38,974 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:39:02,391 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:39:02,499 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:39:02,499 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:39:24,219 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:39:24,322 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:39:24,322 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:39:41,869 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:39:41,981 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:39:41,982 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:40:03,698 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:40:03,820 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:40:03,820 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:40:20,810 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:40:20,964 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:40:20,964 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 14:40:43,679 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 14:40:43,791 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1521034962793> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: the path "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml"> does not exist

2018-03-14 14:40:43,793 b7d510b6 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 733, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 666, in RunBenchmark
    DoCleanupPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 569, in DoCleanupPhase
    spec.BenchmarkCleanup(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 167, in Cleanup
    kubernetes_helper.DeleteAllFiles(getKubernetesScripts())
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/kubernetes_helper.py",> line 49, in DeleteAllFiles
    DeleteFromFile(file)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/kubernetes_helper.py",> line 44, in DeleteFromFile
    vm_util.IssueRetryableCommand(delete_cmd)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 249, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py",> line 387, in IssueRetryableCommand
    'Command returned a non-zero exit code.\n')
CalledProcessException: Command returned a non-zero exit code.
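The traceback above spells out the cleanup path (Cleanup -> kubernetes_helper.DeleteAllFiles -> DeleteFromFile -> vm_util.IssueRetryableCommand), so the missing manifest keeps the run spinning in the retry loop until it finally surfaces as CalledProcessException. A small guard of the kind sketched below would skip the delete when the manifest is absent; the helper name and signature are hypothetical, not the real kubernetes_helper API.

    import os

    # Hypothetical guard around a DeleteFromFile-style cleanup step; the real
    # perfkitbenchmarker.kubernetes_helper module is not reproduced here.
    def delete_from_file_if_present(kubeconfig, manifest, run_cmd):
        """Issue 'kubectl delete -f <manifest>' only when the manifest exists locally."""
        if not os.path.isfile(manifest):
            print("skipping delete, manifest not found: %s" % manifest)
            return
        run_cmd(["kubectl", "--kubeconfig=" + kubeconfig, "delete", "-f", manifest])

    # e.g. reusing the run_with_retries sketch shown earlier in this digest:
    # delete_from_file_if_present("/path/to/kubeconfig",
    #                             "/path/to/hdfs-single-datanode-cluster.yml",
    #                             run_with_retries)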

2018-03-14 14:40:43,794 b7d510b6 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-03-14 14:40:43,794 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-03-14 14:40:43,794 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/b7d510b6/pkb.log>
2018-03-14 14:40:43,794 b7d510b6 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/b7d510b6/completion_statuses.json>
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #3

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/3/display/redirect>

------------------------------------------
[...truncated 101.79 KB...]

2018-03-14 12:44:39,961 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 12:45:05,577 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 12:45:05,673 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: stat <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1>: no such file or directory

2018-03-14 12:45:05,673 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 12:45:26,786 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 12:45:26,881 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: stat <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1>: no such file or directory

2018-03-14 12:45:26,881 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 12:45:55,145 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 12:45:55,279 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml}>  ReturnCode:1
STDOUT: 
STDERR: error: stat <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1>: no such file or directory

2018-03-14 12:45:55,279 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

2018-03-14 12:46:13,375 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 12:46:13,471 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>}  ReturnCode:1
STDOUT: 
STDERR: error: stat <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1>: no such file or directory

2018-03-14 12:46:13,471 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Retrying exception running IssueRetryableCommand: Command returned a non-zero exit code.

[The identical kubectl delete was re-run roughly every 20 to 30 seconds and failed each time with the same "no such file or directory" error; only the final attempt at 12:56 is reproduced below.]

2018-03-14 12:56:38,905 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-03-14 12:56:39,015 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Ran: {kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>}  ReturnCode:1
STDOUT: 
STDERR: error: stat <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/kubeconfig1>: no such file or directory

2018-03-14 12:56:39,019 428788c1 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 733, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 666, in RunBenchmark
    DoCleanupPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 569, in DoCleanupPhase
    spec.BenchmarkCleanup(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 167, in Cleanup
    kubernetes_helper.DeleteAllFiles(getKubernetesScripts())
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/kubernetes_helper.py>", line 49, in DeleteAllFiles
    DeleteFromFile(file)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/kubernetes_helper.py>", line 44, in DeleteFromFile
    vm_util.IssueRetryableCommand(delete_cmd)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 249, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 387, in IssueRetryableCommand
    'Command returned a non-zero exit code.\n')
CalledProcessException: Command returned a non-zero exit code.
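
The ten minutes of retries above and this traceback both come from vm_util.IssueRetryableCommand, which, as the repeated "Retrying exception ... Command returned a non-zero exit code." lines show, keeps re-running the kubectl delete on any non-zero exit even though a kubeconfig that has already been removed will never reappear. Purely as an illustration (a Python 3 sketch, not PKB's implementation; the helper name, the attempt limit, and the stderr check are assumptions), a bounded retry that fails fast on this kind of permanent error could look like this:

    import subprocess
    import time

    def run_with_bounded_retries(cmd, max_attempts=5, delay_s=20):
        # Hypothetical sketch: retry a flaky command, but stop early when the
        # failure is clearly permanent (for example, a missing kubeconfig).
        for attempt in range(1, max_attempts + 1):
            result = subprocess.run(cmd, capture_output=True, text=True)
            if result.returncode == 0:
                return result
            # A "no such file or directory" error will not fix itself on retry.
            if 'no such file or directory' in result.stderr.lower():
                raise RuntimeError('non-retryable failure on attempt %d: %s'
                                   % (attempt, result.stderr.strip()))
            time.sleep(delay_s)
        raise RuntimeError('command still failing after %d attempts' % max_attempts)

    # Illustrative call; the kubeconfig path is a placeholder, not the real run dir:
    # run_with_bounded_retries(['kubectl', '--kubeconfig=/tmp/kubeconfig1',
    #                           'delete', '-f', 'hdfs-single-datanode-cluster.yml'])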

2018-03-14 12:56:39,020 428788c1 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-03-14 12:56:39,020 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-03-14 12:56:39,021 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/pkb.log>
2018-03-14 12:56:39,021 428788c1 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/428788c1/completion_statuses.json>
Build step 'Execute shell' marked build as failure
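
A complementary guard would sit one level up in the call chain from the traceback, where kubernetes_helper.DeleteFromFile issues the delete: skip the kubectl call when the per-run kubeconfig no longer exists, so cleanup reports the real problem immediately instead of retrying against a cluster it can no longer reach. Again, this is only a sketch with invented names, not PKB's code:

    import os
    import subprocess

    def delete_from_file(yaml_path, kubeconfig):
        # Sketch only (names invented for illustration): if the per-run
        # kubeconfig has already been removed there is no cluster to talk to,
        # so report and skip instead of retrying a doomed command.
        if not os.path.isfile(kubeconfig):
            print('kubeconfig %s is missing; skipping delete of %s'
                  % (kubeconfig, yaml_path))
            return
        subprocess.check_call(['kubectl', '--kubeconfig=' + kubeconfig,
                               'delete', '-f', yaml_path])

Either way, the missing runs/428788c1/kubeconfig1 would have been surfaced as the root symptom within seconds rather than after the full retry window.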

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #2

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/2/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam1 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 3c4a2001e5364657f6596bfc338f792e4797712d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 3c4a2001e5364657f6596bfc338f792e4797712d
Commit message: "[BEAM-3502] Remove usage of proto.Builder.clone() in DatastoreIO (#4449)"
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2237152919383760993.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2177900014142559695.sh
+ rm -rf .env
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4036582827776463186.sh
+ virtualenv .env --system-site-packages
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.env/bin/python>
Installing setuptools, pip, wheel...done.
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins7370458095403154352.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins452592715839845631.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: jinja2>=2.7 in /usr/local/lib/python2.7/dist-packages (from -r PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 16))
Requirement already satisfied: colorlog[windows]==2.6.0 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: blinker>=1.3 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 18))
Requirement already satisfied: futures>=3.0.3 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 19))
Requirement already satisfied: PyYAML==3.12 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 20))
Requirement already satisfied: pint>=0.7 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 21))
Requirement already satisfied: numpy in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 22))
Requirement already satisfied: functools32 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 23))
Requirement already satisfied: contextlib2>=0.5.1 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 24))
Requirement already satisfied: pywinrm in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: six in /home/jenkins/.local/lib/python2.7/site-packages (from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14))
Requirement already satisfied: MarkupSafe in /usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r PerfKitBenchmarker/requirements.txt (line 15))
Requirement already satisfied: colorama; extra == "windows" in /usr/lib/python2.7/dist-packages (from colorlog[windows]==2.6.0->-r PerfKitBenchmarker/requirements.txt (line 17))
Requirement already satisfied: xmltodict in /home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: requests-ntlm>=0.3.0 in /home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: requests>=2.9.1 in /home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: ntlm-auth>=1.0.2 in /home/jenkins/.local/lib/python2.7/site-packages (from requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: cryptography>=1.3 in /home/jenkins/.local/lib/python2.7/site-packages (from requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: idna<2.6,>=2.5 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: urllib3<1.22,>=1.21.1 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: certifi>=2017.4.17 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: cffi>=1.7; platform_python_implementation != "PyPy" in /home/jenkins/.local/lib/python2.7/site-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: enum34; python_version < "3" in /home/jenkins/.local/lib/python2.7/site-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: asn1crypto>=0.21.0 in /home/jenkins/.local/lib/python2.7/site-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: ipaddress; python_version < "3" in /usr/local/lib/python2.7/dist-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
Requirement already satisfied: pycparser in /home/jenkins/.local/lib/python2.7/site-packages (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25))
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4979635160790451769.sh
+ .env/bin/pip install -e 'src/sdks/python/[gcp,test]'
Obtaining file://<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/sdks/python>
Requirement already satisfied: avro<2.0.0,>=1.8.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: crcmod<2.0,>=1.7 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: dill==0.2.6 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: grpcio<2,>=1.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: httplib2<0.10,>=0.8 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: oauth2client<5,>=2.0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: pyyaml<4.0.0,>=3.12 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: six<1.12,>=1.9 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: typing<3.7.0,>=3.6.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: futures<4.0.0,>=3.1.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: hdfs3<0.4.0,>=0.3.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: google-apitools<=0.5.20,>=0.5.18 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: googledatastore==7.0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: google-cloud-pubsub==0.26.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: google-cloud-bigquery==0.25.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.3.0.dev0)
Requirement already satisfied: enum34>=1.0.4 in /home/jenkins/.local/lib/python2.7/site-packages (from grpcio<2,>=1.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: funcsigs>=1; python_version < "3.3" in /home/jenkins/.local/lib/python2.7/site-packages (from mock<3.0.0,>=1.0.1->apache-beam==2.3.0.dev0)
Requirement already satisfied: pbr>=0.11 in /home/jenkins/.local/lib/python2.7/site-packages (from mock<3.0.0,>=1.0.1->apache-beam==2.3.0.dev0)
Requirement already satisfied: pyasn1>=0.1.7 in /home/jenkins/.local/lib/python2.7/site-packages (from oauth2client<5,>=2.0.1->apache-beam==2.3.0.dev0)
Requirement already satisfied: pyasn1-modules>=0.0.5 in /home/jenkins/.local/lib/python2.7/site-packages (from oauth2client<5,>=2.0.1->apache-beam==2.3.0.dev0)
Requirement already satisfied: rsa>=3.1.4 in /home/jenkins/.local/lib/python2.7/site-packages (from oauth2client<5,>=2.0.1->apache-beam==2.3.0.dev0)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages (from protobuf<4,>=3.5.0.post1->apache-beam==2.3.0.dev0)
Requirement already satisfied: fasteners>=0.14 in /home/jenkins/.local/lib/python2.7/site-packages (from google-apitools<=0.5.20,>=0.5.18->apache-beam==2.3.0.dev0)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in /home/jenkins/.local/lib/python2.7/site-packages (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: google-cloud-core<0.26dev,>=0.25.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: monotonic>=0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from fasteners>=0.14->google-apitools<=0.5.20,>=0.5.18->apache-beam==2.3.0.dev0)
Requirement already satisfied: google-auth<2.0.0dev,>=0.4.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: google-auth-httplib2 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: google-gax<0.16dev,>=0.15.7 in /home/jenkins/.local/lib/python2.7/site-packages (from gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: proto-google-cloud-pubsub-v1[grpc]<0.16dev,>=0.15.4 in /home/jenkins/.local/lib/python2.7/site-packages (from gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: grpc-google-iam-v1<0.12dev,>=0.11.1 in /home/jenkins/.local/lib/python2.7/site-packages (from gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: cachetools>=2.0.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-auth<2.0.0dev,>=0.4.0->google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: future<0.17dev,>=0.16.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: ply==3.8 in /home/jenkins/.local/lib/python2.7/site-packages (from google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: requests<3.0dev,>=2.13.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: idna<2.6,>=2.5 in /home/jenkins/.local/lib/python2.7/site-packages (from requests<3.0dev,>=2.13.0->google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: urllib3<1.22,>=1.21.1 in /home/jenkins/.local/lib/python2.7/site-packages (from requests<3.0dev,>=2.13.0->google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /home/jenkins/.local/lib/python2.7/site-packages (from requests<3.0dev,>=2.13.0->google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Requirement already satisfied: certifi>=2017.4.17 in /home/jenkins/.local/lib/python2.7/site-packages (from requests<3.0dev,>=2.13.0->google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.3.0.dev0)
Installing collected packages: apache-beam
  Found existing installation: apache-beam 2.3.0.dev0
    Not uninstalling apache-beam at /home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_TextIOIT/src/sdks/python, outside environment <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.env>
  Running setup.py develop for apache-beam
Successfully installed apache-beam
[beam_PerformanceTests_TextIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins479342455195799723.sh
+ .env/bin/python PerfKitBenchmarker/pkb.py --project=apache-beam-testing --dpb_log_level=INFO --maven_binary=/home/jenkins/tools/maven/latest/bin/mvn --bigquery_table=beam_performance.textioit_hdfs_pkb_results --official=true --benchmarks=beam_integration_benchmark --beam_it_timeout=1200 --beam_it_profile=io-it --beam_prebuilt=true --beam_sdk=java --beam_it_module=sdks/java/io/file-based-io-tests --beam_it_class=org.apache.beam.sdk.io.text.TextIOIT '--beam_it_options=[--project=apache-beam-testing,--tempRoot=gs://temp-storage-for-perf-tests,--numberOfRecords=1000000]' '--beam_extra_mvn_properties=[filesystem=hdfs]' --beam_options_config_file=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/pkb-config.yml> --beam_kubernetes_scripts=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml,/home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_TextIOIT_HDFS/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
Traceback (most recent call last):
  File "PerfKitBenchmarker/pkb.py", line 21, in <module>
    sys.exit(Main())
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 869, in Main
    SetUpPKB()
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 740, in SetUpPKB
    vm_util.GenTempDir()
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 121, in GenTempDir
    temp_dir.CreateTemporaryDirectories()
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/temp_dir.py>", line 62, in CreateTemporaryDirectories
    os.makedirs(path)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/.env/lib/python2.7/os.py>", line 157, in makedirs
    mkdir(name, mode)
OSError: [Errno 13] Permission denied: '/tmp/perfkitbenchmarker/runs/50b63e64'
Build step 'Execute shell' marked build as failure
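
This second run fails much earlier, during SetUpPKB, because os.makedirs cannot create the per-run directory /tmp/perfkitbenchmarker/runs/50b63e64 (Errno 13). A plausible, but unconfirmed, explanation is that the shared /tmp/perfkitbenchmarker tree on the beam1 agent was created by a different user in an earlier run, leaving the jenkins user unable to create new subdirectories under it. A small hypothetical diagnostic along these lines would confirm who owns that tree before deciding whether to change its ownership or remove it:

    import os
    import pwd
    import stat

    def report_temp_dir_ownership(path='/tmp/perfkitbenchmarker'):
        # Illustrative check: EACCES from os.makedirs under this path usually
        # means the directory already exists but belongs to another account.
        if not os.path.exists(path):
            print('%s does not exist yet' % path)
            return
        st = os.stat(path)
        owner = pwd.getpwuid(st.st_uid).pw_name
        print('%s  owner=%s  mode=%s' % (path, owner, stat.filemode(st.st_mode)))

    report_temp_dir_ownership()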