Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/05/30 00:00:38 UTC

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #230

See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/230/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-4321] Enforce ErrorProne analysis in join-library extensions

[thw] [BEAM-4297] Streaming executable stage translation and operator for

[thw] [BEAM-4297] Add serialization test and make output mapping stable.

[kenn] Force load of Calcite JDBC driver (avoids internal error)

[kenn] Explicitly load Beam JDBC driver before test

[iemejia] [BEAM-3813] Support (de)serialization of S3 encryption options via JSON

[Pablo] [BEAM-4426]: Addressed DataflowDistributionAccumulatorTest failure on

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7b25821b0b473e075b4e410177daebfb381d4598 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7b25821b0b473e075b4e410177daebfb381d4598
Commit message: "[BEAM-4426]: Addressed DataflowDistributionAccumulatorTest failure on Windows"
 > git rev-list --no-walk 908fbcf28bb3ab5efeafe158ef29bd3e548e6500 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3348574653312234330.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6299397486815927116.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-230>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2857879596632759175.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-230> create namespace filebasedioithdfs-230
Error from server (AlreadyExists): namespaces "filebasedioithdfs-230" already exists
Build step 'Execute shell' marked build as failure
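
The build did not fail on the test itself: the step runs under /bin/bash -xe,
so the non-zero exit from "kubectl create namespace" on an already-existing
namespace aborts the shell. A minimal idempotent sketch (assuming the same
kubeconfig path and namespace name shown in the log above, with Jenkins'
$WORKSPACE pointing at the job workspace):

    # Create the namespace only if it is missing; the "get || create" guard
    # keeps a namespace left over from an earlier attempt from failing a -e
    # shell.
    KUBECONFIG_PATH="$WORKSPACE/config-filebasedioithdfs-230"
    NS="filebasedioithdfs-230"
    kubectl --kubeconfig="$KUBECONFIG_PATH" get namespace "$NS" >/dev/null 2>&1 \
      || kubectl --kubeconfig="$KUBECONFIG_PATH" create namespace "$NS"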

Jenkins build is back to normal : beam_PerformanceTests_XmlIOIT_HDFS #269

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/269/display/redirect>


Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #268

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/268/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-4517] Add PyPI status badge.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 71151b569c3756dd9075abdf08ab3d6a649446ef (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 71151b569c3756dd9075abdf08ab3d6a649446ef
Commit message: "[BEAM-4517] Add PyPI status badge."
 > git rev-list --no-walk 492138bfe176d63ce7700cb90a2daf24deab03c3 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2653402760251955055.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2180683981877600108.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-268>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4196415902388380853.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-268> create namespace filebasedioithdfs-268
Error from server (AlreadyExists): namespaces "filebasedioithdfs-268" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #267

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/267/display/redirect?page=changes>

Changes:

[ehudm] Add a custom _url_dirname for local filesystems.

[github] Fix flaky comparison in log_handler_test.py

[jiangkai] add expression boolean casting

[jiangkai] add boolean type for agg function

[jiangkai] support EXISTS operator

[github] Update Environments.java

[github] Remove unneeded collection import.

[github] Combine immutability type fixes. (#3)

[github] fixup!

[lcwik] BEAM-3876 avoid NPE if checkpoint is null in an unbounded source

[lcwik] testing npe fix and exception rethrow in unit tests

[lcwik] Update https://github.com/apache/beam/pull/4894 to correspond with

------------------------------------------
[...truncated 435.21 KB...]
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:288)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    java.net.ConnectException: Call From xmlioit0writethenreadall--06071724-73bg-harness-rl38.c.apache-beam-testing.internal/10.128.0.5 to 190.239.211.130.bc.googleusercontent.com:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
    	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1479)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy64.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy65.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1648)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1689)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1624)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
    	at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:924)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
    Caused by: java.net.ConnectException: Connection refused
    	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
    	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
    	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
    	at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
    	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy64.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy65.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1648)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1689)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1624)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
    	at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:924)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:185)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:146)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:219)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:69)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:517)
    	at org.apache.beam.sdk.transforms.DoFnOutputReceivers$WindowedContextOutputReceiver.output(DoFnOutputReceivers.java:42)
    	at org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:131)
    	at org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:185)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:149)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:200)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:158)
    	at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:75)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:391)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:360)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:288)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    Workflow failed. Causes: S03:Generate sequence/Read(BoundedCountingSource)+Create xml records/Map+Write xml files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write xml files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+Write xml files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+Write xml files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+Write xml files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write+Write xml files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write xml files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
      xmlioit0writethenreadall--06071724-73bg-harness-rl38,
      xmlioit0writethenreadall--06071724-73bg-harness-rl38,
      xmlioit0writethenreadall--06071724-73bg-harness-7h58,
      xmlioit0writethenreadall--06071724-73bg-harness-rl38
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:348)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:329)
        at org.apache.beam.sdk.io.xml.XmlIOIT.writeThenReadAll(XmlIOIT.java:132)
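
Every attempt above failed with the same root cause: a java.net.ConnectException
from the Dataflow workers to the HDFS namenode RPC port (9000), meaning the
Kubernetes-hosted namenode was unreachable at the external address the test was
given. The refusal can be reproduced from outside the job with the same
kubeconfig (a sketch; "hadoop-external" is a hypothetical service name standing
in for whatever hdfs-multi-datanode-cluster.yml actually exposes):

    # Resolve the external IP Kubernetes advertises for the HDFS service.
    KUBECONFIG_PATH="$WORKSPACE/config-filebasedioithdfs-267"
    NN_IP=$(kubectl --kubeconfig="$KUBECONFIG_PATH" get svc hadoop-external \
      -o jsonpath='{.status.loadBalancer.ingress[0].ip}')
    # Probe the namenode RPC port; "Connection refused" here matches the
    # exception in the worker logs.
    nc -vz -w 5 "$NN_IP" 9000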

1 test completed, 1 failed
Finished generating test XML results (0.033 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.039 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest>
:beam-sdks-java-io-file-based-io-tests:integrationTest (Thread[Task worker for ':' Thread 2,5,main]) completed. Took 8 mins 48.423 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 9m 1s
73 actionable tasks: 1 executed, 72 up-to-date

Publishing build scan...
https://gradle.com/s/35urbbg2wiho2


STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.
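
To iterate on this failure locally, the failing task can be re-run on its own
with more logging (a sketch; run from the root of the Beam checkout, and note
the test still needs the pipeline options this job normally passes in):

    # Re-run only the failing integration test with verbose Gradle output.
    ./gradlew :beam-sdks-java-io-file-based-io-tests:integrationTest \
      --tests org.apache.beam.sdk.io.xml.XmlIOIT --info --stacktrace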

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>
	at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:612)
	at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:484)
	at org.gradle.api.tasks.testing.Test.executeTests(Test.java:583)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:46)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:794)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:761)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-06-08 00:32:40,679 c5079a2c MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py">, line 667, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py">, line 547, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py">, line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py">, line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-06-08 00:32:40,680 c5079a2c MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-06-08 00:32:40,680 c5079a2c MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-267> delete -f <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml> --ignore-not-found
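
Assuming hdfs-multi-datanode-cluster.yml does not itself declare the namespace,
this cleanup removes the HDFS resources but leaves the per-run namespace
behind. Deleting the namespace as well would garbage-collect everything inside
it (a sketch, reusing the naming from the log above):

    # Remove the per-run namespace; --ignore-not-found keeps this safe even
    # when the namespace was never created.
    KUBECONFIG_PATH="$WORKSPACE/config-filebasedioithdfs-267"
    kubectl --kubeconfig="$KUBECONFIG_PATH" delete namespace \
      "filebasedioithdfs-267" --ignore-not-found
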
2018-06-08 00:34:43,211 c5079a2c MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py">, line 801, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py">, line 667, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py">, line 547, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py">, line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py">, line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-06-08 00:34:43,212 c5079a2c MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-06-08 00:34:43,265 c5079a2c MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-06-08 00:34:43,266 c5079a2c MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/c5079a2c/pkb.log>
2018-06-08 00:34:43,266 c5079a2c MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/c5079a2c/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #266

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/266/display/redirect?page=changes>

Changes:

[amaliujia] Add LIKE operator to Beam SQL.

[amaliujia] Address comments: 1. Use SqlFunction.like to implement like operator. 2.

[apilloud] [SQL] Use AutoService for jdbc

[apilloud] [SQL] AutoService for TableProvider

[apilloud] [SQL] Don't cache IT results

[apilloud] [SQL] Test that TextIO and Direct Runner work

[apilloud] [SQL] Add context class loader hack

[github] [BEAM-4480] Fixed deprecated method invoking

[cademarkegard] [BEAM-4326] Enforce ErrorProne analysis in the fn-execution project

[coheigea] Removing extraneous whitespace around the equals operator

------------------------------------------
[...truncated 361.99 KB...]
    	at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy65.getFileInfo(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
    	at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
    	at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
    	at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1657)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.match(HadoopFileSystem.java:81)
    	at org.apache.beam.sdk.io.FileSystems.match(FileSystems.java:123)
    	at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeleteFileFn.processElement(FileBasedIOITHelper.java:90)
    	at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeleteFileFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:185)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:146)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:200)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:158)
    	at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:75)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:391)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:360)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:288)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    java.net.ConnectException: Call From xmlioit0writethenreadall--06071106-9gd6-harness-lq92.c.apache-beam-testing.internal/10.128.0.2 to 9.14.188.35.bc.googleusercontent.com:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
    	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1479)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy64.getFileInfo(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
    	at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy65.getFileInfo(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
    	at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
    	at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
    	at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1657)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.match(HadoopFileSystem.java:81)
    	at org.apache.beam.sdk.io.FileSystems.match(FileSystems.java:123)
    	at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeleteFileFn.processElement(FileBasedIOITHelper.java:90)
    Caused by: java.net.ConnectException: Connection refused
    	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
    	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
    	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
    	at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
    	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy64.getFileInfo(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
    	at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy65.getFileInfo(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
    	at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
    	at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
    	at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1657)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.match(HadoopFileSystem.java:81)
    	at org.apache.beam.sdk.io.FileSystems.match(FileSystems.java:123)
    	at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeleteFileFn.processElement(FileBasedIOITHelper.java:90)
    	at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeleteFileFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:185)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:146)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:200)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:158)
    	at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:75)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:391)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:360)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:288)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    Workflow failed. Causes: S33:Delete test files failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
      xmlioit0writethenreadall--06071106-9gd6-harness-lq92,
      xmlioit0writethenreadall--06071106-9gd6-harness-lq92,
      xmlioit0writethenreadall--06071106-9gd6-harness-lq92,
      xmlioit0writethenreadall--06071106-9gd6-harness-lq92
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:348)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:329)
        at org.apache.beam.sdk.io.xml.XmlIOIT.writeThenReadAll(XmlIOIT.java:132)

1 test completed, 1 failed
Finished generating test XML results (0.027 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.033 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest>
:beam-sdks-java-io-file-based-io-tests:integrationTest (Thread[Task worker for ':' Thread 9,5,main]) completed. Took 5 mins 36.953 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 48s
73 actionable tasks: 1 executed, 72 up-to-date

Publishing build scan...
https://gradle.com/s/r2gtrfccepgim


STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>
	at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:612)
	at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:484)
	at org.gradle.api.tasks.testing.Test.executeTests(Test.java:583)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:46)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:794)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:761)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-06-07 18:11:50,932 d5f0bb03 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py">, line 667, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py">, line 547, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py">, line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py">, line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-06-07 18:11:50,933 d5f0bb03 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-06-07 18:11:50,933 d5f0bb03 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-beam-performancetests-xmlioit-hdfs-266> delete -f <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml> --ignore-not-found
2018-06-07 18:13:53,393 d5f0bb03 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py">, line 801, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py">, line 667, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py">, line 547, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py">, line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py">, line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-06-07 18:13:53,394 d5f0bb03 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-06-07 18:13:53,442 d5f0bb03 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-06-07 18:13:53,443 d5f0bb03 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/d5f0bb03/pkb.log>
2018-06-07 18:13:53,443 d5f0bb03 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/d5f0bb03/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #265

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/265/display/redirect>

------------------------------------------
[...truncated 376.27 KB...]
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:288)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    java.net.ConnectException: Call From xmlioit0writethenreadall--06070709-yliq-harness-8lk5.c.apache-beam-testing.internal/10.128.0.2 to 55.145.184.35.bc.googleusercontent.com:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
    	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1479)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy64.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy65.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1648)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1689)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1624)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
    	at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:924)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
    Caused by: java.net.ConnectException: Connection refused
    	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
    	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
    	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
    	at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
    	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy64.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy65.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1648)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1689)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1624)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
    	at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:924)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:185)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:146)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:219)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:69)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:517)
    	at org.apache.beam.sdk.transforms.DoFnOutputReceivers$WindowedContextOutputReceiver.output(DoFnOutputReceivers.java:42)
    	at org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:131)
    	at org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:185)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:149)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:200)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:158)
    	at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:75)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:391)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:360)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:288)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    Workflow failed. Causes: S03:Generate sequence/Read(BoundedCountingSource)+Create xml records/Map+Write xml files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write xml files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+Write xml files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+Write xml files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+Write xml files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write+Write xml files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write xml files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
      xmlioit0writethenreadall--06070709-yliq-harness-8lk5,
      xmlioit0writethenreadall--06070709-yliq-harness-8lk5,
      xmlioit0writethenreadall--06070709-yliq-harness-8lk5,
      xmlioit0writethenreadall--06070709-yliq-harness-8lk5
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:348)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:329)
        at org.apache.beam.sdk.io.xml.XmlIOIT.writeThenReadAll(XmlIOIT.java:132)

1 test completed, 1 failed
Finished generating test XML results (0.023 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest>
:beam-sdks-java-io-file-based-io-tests:integrationTest (Thread[Task worker for ':' Thread 12,5,main]) completed. Took 3 mins 54.397 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 3s
73 actionable tasks: 1 executed, 72 up-to-date

Publishing build scan...
https://gradle.com/s/xxlxl3ui5n5ki


STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>
	at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:612)
	at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:484)
	at org.gradle.api.tasks.testing.Test.executeTests(Test.java:583)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:46)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:794)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:761)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-06-07 14:12:48,909 262d2e3d MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 667, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 547, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-06-07 14:12:48,910 262d2e3d MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-06-07 14:12:48,910 262d2e3d MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-beam-performancetests-xmlioit-hdfs-265> delete -f <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml> --ignore-not-found
2018-06-07 14:14:43,631 262d2e3d MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 801, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 667, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 547, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-06-07 14:14:43,632 262d2e3d MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-06-07 14:14:43,689 262d2e3d MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-06-07 14:14:43,689 262d2e3d MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/262d2e3d/pkb.log>
2018-06-07 14:14:43,690 262d2e3d MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/262d2e3d/completion_statuses.json>
Build step 'Execute shell' marked build as failure
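
In build #265 above the tests actually ran and failed against HDFS: the Dataflow worker resolves the namenode to an external address (55.145.184.35.bc.googleusercontent.com) and every connection to port 9000 is refused, so each attempt to open a file through HadoopFileSystem fails in DFSClient.create, and Dataflow abandons the work item after four attempts. A reachability probe against the endpoint named in the trace is the minimal first check (host and port are copied from the log above; the flags assume a conventional netcat build):

    # -z connects without sending data; -w 5 is a five-second timeout.
    # "Connection refused" here reproduces the worker-side failure.
    nc -vz -w 5 55.145.184.35.bc.googleusercontent.com 9000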

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #264

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/264/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 51b6adf3f9376ab32218abd4f08fd9b6dfa364df (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 51b6adf3f9376ab32218abd4f08fd9b6dfa364df
Commit message: "[BEAM-4517] Add maven status in README"
 > git rev-list --no-walk 51b6adf3f9376ab32218abd4f08fd9b6dfa364df # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins31785821938502047.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8658562360520856146.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-264>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8276610630931511466.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-264> create namespace filebasedioithdfs-264
Error from server (AlreadyExists): namespaces "filebasedioithdfs-264" already exists
Build step 'Execute shell' marked build as failure
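
Builds #264 through #252 below all abort at this same step: the script runs with bash -xe, so the non-zero exit from kubectl create namespace on a namespace left over from a previous run kills the build before any test starts. A sketch of an idempotent variant (an assumed fix, not the project's actual change; KUBECONFIG_PATH stands in for the per-build config path shown in the log):

    # Create the namespace only if it does not already exist, so a
    # leftover namespace cannot fail the step under 'set -e'.
    kubectl --kubeconfig="$KUBECONFIG_PATH" get namespace filebasedioithdfs-264 >/dev/null 2>&1 \
      || kubectl --kubeconfig="$KUBECONFIG_PATH" create namespace filebasedioithdfs-264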

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #263

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/263/display/redirect?page=changes>

Changes:

[github] [BEAM-4517] Add maven status in README

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 51b6adf3f9376ab32218abd4f08fd9b6dfa364df (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 51b6adf3f9376ab32218abd4f08fd9b6dfa364df
Commit message: "[BEAM-4517] Add maven status in README"
 > git rev-list --no-walk f89e19ba2983363e049115c26e49477e7f658ac2 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8774542296913292455.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins1374785018920161147.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-263>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6602100023600783143.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-263> create namespace filebasedioithdfs-263
Error from server (AlreadyExists): namespaces "filebasedioithdfs-263" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #262

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/262/display/redirect?page=changes>

Changes:

[szewinho] [BEAM-3214] Add integration test for HBaseIO.

[axelmagn] Fix ProvisionInfo in DockerJobBundleFactory

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision f89e19ba2983363e049115c26e49477e7f658ac2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f f89e19ba2983363e049115c26e49477e7f658ac2
Commit message: "Merge pull request #5499: [BEAM-3214] Add integration test for HBaseIO."
 > git rev-list --no-walk 72cbd99d6b62bc7ed16dbd1288cd61d54e8bda37 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4668099613338417426.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins1373052280453962566.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-262>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins104901659497018997.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-262> create namespace filebasedioithdfs-262
Error from server (AlreadyExists): namespaces "filebasedioithdfs-262" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #261

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/261/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 72cbd99d6b62bc7ed16dbd1288cd61d54e8bda37 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 72cbd99d6b62bc7ed16dbd1288cd61d54e8bda37
Commit message: "Moving to 2.6.0-SNAPSHOT on master branch"
 > git rev-list --no-walk 72cbd99d6b62bc7ed16dbd1288cd61d54e8bda37 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6647651646722716607.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6050753238734858302.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-261>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins1044128836986557744.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-261> create namespace filebasedioithdfs-261
Error from server (AlreadyExists): namespaces "filebasedioithdfs-261" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #260

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/260/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 72cbd99d6b62bc7ed16dbd1288cd61d54e8bda37 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 72cbd99d6b62bc7ed16dbd1288cd61d54e8bda37
Commit message: "Moving to 2.6.0-SNAPSHOT on master branch"
 > git rev-list --no-walk 72cbd99d6b62bc7ed16dbd1288cd61d54e8bda37 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins7523760987503563991.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2516702660787844453.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-260>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5763705489880973284.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-260> create namespace filebasedioithdfs-260
Error from server (AlreadyExists): namespaces "filebasedioithdfs-260" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #259

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/259/display/redirect?page=changes>

Changes:

[alan] [BEAM-4423] Mark pull requests stale after 60 days; close 7 days after

[jbonofre] Moving to 2.6.0-SNAPSHOT on master branch

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 72cbd99d6b62bc7ed16dbd1288cd61d54e8bda37 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 72cbd99d6b62bc7ed16dbd1288cd61d54e8bda37
Commit message: "Moving to 2.6.0-SNAPSHOT on master branch"
 > git rev-list --no-walk 5b9faa416059299d0fe442b67dea5b3c1cb3d83b # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3388299392430038802.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins497939180305320190.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-259>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins105848776242841256.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-259> create namespace filebasedioithdfs-259
Error from server (AlreadyExists): namespaces "filebasedioithdfs-259" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #258

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/258/display/redirect?page=changes>

Changes:

[rober] [BEAM-4276] Add combiner lifting support to Go SDK

[rober] fixup! Address comments.

[lcwik] [BEAM-4481] Remove duplicate definitions of dependencies.

[github] Improve default value for experiments set in RuntimeValueProvider

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 5b9faa416059299d0fe442b67dea5b3c1cb3d83b (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 5b9faa416059299d0fe442b67dea5b3c1cb3d83b
Commit message: "[BEAM-4276] Add combiner lifting support to Go SDK"
 > git rev-list --no-walk c1743ccae68a57b46cff3bb13441fb2fbc55e511 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5050179319959696556.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2191640849729469897.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-258>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8052042295114095276.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-258> create namespace filebasedioithdfs-258
Error from server (AlreadyExists): namespaces "filebasedioithdfs-258" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #257

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/257/display/redirect?page=changes>

Changes:

[ajamato] Add new metrics protos based on s.apache.org/beam-fn-api-metrics

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision c1743ccae68a57b46cff3bb13441fb2fbc55e511 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c1743ccae68a57b46cff3bb13441fb2fbc55e511
Commit message: "[BEAM-3926] Add new metrics protos based on "Defining and adding SDK Metrics" htt…"
 > git rev-list --no-walk 73cc33292dee500c0e9f1a072f4d516f753f8e9d # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3204586369745478553.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2761906552548556593.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-257>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5547393046214228792.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-257> create namespace filebasedioithdfs-257
Error from server (AlreadyExists): namespaces "filebasedioithdfs-257" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #256

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/256/display/redirect?page=changes>

Changes:

[herohde] Tweak Go integration test driver

[iemejia] [BEAM-4310] Enforce ErrorProne analysis in

[relax] Now that Dataflow is updated, we can finish updating tests and usages to

[iemejia] [BEAM-4137] Remove MongoDB specific options from

[relax] Add a test with ReferenceRunner

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 73cc33292dee500c0e9f1a072f4d516f753f8e9d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 73cc33292dee500c0e9f1a072f4d516f753f8e9d
Commit message: "Merge pull request #5546 from iemejia/BEAM-4310-errorprone-runners-extensions-java-metrics"
 > git rev-list --no-walk abe3f3edd5e34245f0a5fef63ce48fef57204b41 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins1953005281578398349.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins7959437948750192207.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-256>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins598087418905178312.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-256> create namespace filebasedioithdfs-256
Error from server (AlreadyExists): namespaces "filebasedioithdfs-256" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #255

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/255/display/redirect?page=changes>

Changes:

[herohde] Fix build break due to artifact staging change

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision abe3f3edd5e34245f0a5fef63ce48fef57204b41 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f abe3f3edd5e34245f0a5fef63ce48fef57204b41
Commit message: "Merge pull request #5558: Fix build break due to artifact staging change"
 > git rev-list --no-walk be6185fd5c2b471f371d7591108360974954ec12 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4914287454551231416.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2804809274711922889.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-255>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3832553775568345163.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-255> create namespace filebasedioithdfs-255
Error from server (AlreadyExists): namespaces "filebasedioithdfs-255" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #254

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/254/display/redirect?page=changes>

Changes:

[ankurgoenka] proto changes to support artifact_staging_id

[ankurgoenka] Doc changes

[ankurgoenka] Renaming artifact_staging_id to staging_session_token

[ankurgoenka] Fixing python code for staging_session_token proto changes

[ankurgoenka] Fixing java code for staging_session_token proto changes

[ankurgoenka] Go proto update

[ankurgoenka] Code review comments fixes

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision be6185fd5c2b471f371d7591108360974954ec12 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f be6185fd5c2b471f371d7591108360974954ec12
Commit message: "Merge pull request #5489: [BEAM-4290] proto changes to support staging_session_token"
 > git rev-list --no-walk 697a1d17e473cd5b097aaaeee24c08f43cc77f58 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4038805679433242819.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins9088843374459220172.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-254>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6128590271259489232.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-254> create namespace filebasedioithdfs-254
Error from server (AlreadyExists): namespaces "filebasedioithdfs-254" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #253

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/253/display/redirect?page=changes>

Changes:

[apilloud] [SQL] Init compiler factory with JDBC class loader

[kedin] [SQL] Add ParseException to sqlEnv.explain()

[kedin] [SQL] Add integration tests for BigQuery writes

[kedin] Add TestBigQuery rule

[altay] Move assert_that's new keyword argument to the end

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 697a1d17e473cd5b097aaaeee24c08f43cc77f58 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 697a1d17e473cd5b097aaaeee24c08f43cc77f58
Commit message: "Merge pull request #5508: [SQL] Add integration tests for BigQuery writes"
 > git rev-list --no-walk 56dc4cfc74495770e772a81779e719721016efb3 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins78140263178537016.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6018405538067529800.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-253>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4022945472379894727.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-253> create namespace filebasedioithdfs-253
Error from server (AlreadyExists): namespaces "filebasedioithdfs-253" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #252

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/252/display/redirect?page=changes>

Changes:

[jhsueh] [BEAM-4320] Enforce ErrorProne analysis in jackson extensions project

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 56dc4cfc74495770e772a81779e719721016efb3 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 56dc4cfc74495770e772a81779e719721016efb3
Commit message: "Merge pull request #5543: [BEAM-4320] Enforce ErrorProne analysis in jackson extensions"
 > git rev-list --no-walk 61721d03d153de4d176ee43d59a54e50eb1d2578 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins840834518996768784.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8910033300604649873.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-252>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins1997493706095510907.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-252> create namespace filebasedioithdfs-252
Error from server (AlreadyExists): namespaces "filebasedioithdfs-252" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #251

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/251/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 61721d03d153de4d176ee43d59a54e50eb1d2578 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 61721d03d153de4d176ee43d59a54e50eb1d2578
Commit message: "[BEAM-4406] Updating portable Dataflow major version numbers"
 > git rev-list --no-walk 61721d03d153de4d176ee43d59a54e50eb1d2578 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins9080485315696988487.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8812356023809240675.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-251>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5848474067721969646.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-251> create namespace filebasedioithdfs-251
Error from server (AlreadyExists): namespaces "filebasedioithdfs-251" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #250

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/250/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 61721d03d153de4d176ee43d59a54e50eb1d2578 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 61721d03d153de4d176ee43d59a54e50eb1d2578
Commit message: "[BEAM-4406] Updating portable Dataflow major version numbers"
 > git rev-list --no-walk 61721d03d153de4d176ee43d59a54e50eb1d2578 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5778894884397206246.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8312598515006809603.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-250>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8791991608917713257.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-250> create namespace filebasedioithdfs-250
Error from server (AlreadyExists): namespaces "filebasedioithdfs-250" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #249

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/249/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 61721d03d153de4d176ee43d59a54e50eb1d2578 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 61721d03d153de4d176ee43d59a54e50eb1d2578
Commit message: "[BEAM-4406] Updating portable Dataflow major version numbers"
 > git rev-list --no-walk 61721d03d153de4d176ee43d59a54e50eb1d2578 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5180333660541361570.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5556081152357896106.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-249>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6115746497140283251.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-249> create namespace filebasedioithdfs-249
Error from server (AlreadyExists): namespaces "filebasedioithdfs-249" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #248

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/248/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 61721d03d153de4d176ee43d59a54e50eb1d2578 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 61721d03d153de4d176ee43d59a54e50eb1d2578
Commit message: "[BEAM-4406] Updating portable Dataflow major version numbers"
 > git rev-list --no-walk 61721d03d153de4d176ee43d59a54e50eb1d2578 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8681820129525224191.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins270916854014494334.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-248>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6294911463266257527.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-248> create namespace filebasedioithdfs-248
Error from server (AlreadyExists): namespaces "filebasedioithdfs-248" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #247

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/247/display/redirect?page=changes>

Changes:

[daniel.o.programmer] [BEAM-4406] Updating Java portable environment major version.

[daniel.o.programmer] [BEAM-4406] Updating Python portable environment major version.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 61721d03d153de4d176ee43d59a54e50eb1d2578 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 61721d03d153de4d176ee43d59a54e50eb1d2578
Commit message: "[BEAM-4406] Updating portable Dataflow major version numbers"
 > git rev-list --no-walk b1c25fe54744e90600f8b67413021aab5162ba59 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4798134106176794380.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins7145632514320837570.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-247>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2440417662767870670.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-247> create namespace filebasedioithdfs-247
Error from server (AlreadyExists): namespaces "filebasedioithdfs-247" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #246

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/246/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b1c25fe54744e90600f8b67413021aab5162ba59 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b1c25fe54744e90600f8b67413021aab5162ba59
Commit message: "[BEAM-4253] Update dataflow worker to fix ParDoTest fails in DataflowValidatesRunner suite"
 > git rev-list --no-walk b1c25fe54744e90600f8b67413021aab5162ba59 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8737087171147853689.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4478751324490970374.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-246>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8928121342473011926.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-246> create namespace filebasedioithdfs-246
Error from server (AlreadyExists): namespaces "filebasedioithdfs-246" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #245

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/245/display/redirect?page=changes>

Changes:

[amyrvold] [BEAM-4253] Update dataflow worker to fix ParDoTest fails in Dataflow

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b1c25fe54744e90600f8b67413021aab5162ba59 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b1c25fe54744e90600f8b67413021aab5162ba59
Commit message: "[BEAM-4253] Update dataflow worker to fix ParDoTest fails in DataflowValidatesRunner suite"
 > git rev-list --no-walk 4511734b9cae49eca785b051e5a57c5bacd87952 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2105992306062468075.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4635967651976343909.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-245>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4541929720019279924.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-245> create namespace filebasedioithdfs-245
Error from server (AlreadyExists): namespaces "filebasedioithdfs-245" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #244

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/244/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4511734b9cae49eca785b051e5a57c5bacd87952 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4511734b9cae49eca785b051e5a57c5bacd87952
Commit message: "Merge pull request #5535: Use Docker in the ReferenceRunner"
 > git rev-list --no-walk 4511734b9cae49eca785b051e5a57c5bacd87952 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2729463177438417470.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5785566344321433794.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-244>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3427657442769037119.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-244> create namespace filebasedioithdfs-244
Error from server (AlreadyExists): namespaces "filebasedioithdfs-244" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #243

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/243/display/redirect?page=changes>

Changes:

[tgroh] Use the Current Users Container as the Environment

[tgroh] Enable Docker in the PortableDirectRunner

[tgroh] Move DirectJobBundleFactory to fn-execution

[tgroh] Cleanups to the PortableDirectRunner

[tgroh] Cleanups and Wiring for the ReferenceRunnerJobService

[tgroh] Use an UnsupportedArtifactRetrievalService

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 4511734b9cae49eca785b051e5a57c5bacd87952 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 4511734b9cae49eca785b051e5a57c5bacd87952
Commit message: "Merge pull request #5535: Use Docker in the ReferenceRunner"
 > git rev-list --no-walk e33da0da577b062d796dc33032214cd4846092b4 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2017923642224485967.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4211521469233559880.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-243>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8213194465747911628.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-243> create namespace filebasedioithdfs-243
Error from server (AlreadyExists): namespaces "filebasedioithdfs-243" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #242

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/242/display/redirect?page=changes>

Changes:

[kedin] [SQL] Replace planner.compilePipeline() with sqlEnv.parseQuery()

[kedin] [SQL] Add sqlEnv.executeDdl()

[kedin] [SQL] Make planner package-private

[kedin] [SQL] Add factory methods to BeamSqlEnv

[kedin] [SQL] Rename ReadOnlyTableProvider

[cademarkegard] [BEAM-4303] Enforce ErrorProne analysis in examples project

[kedin] [SQL] Wrap SQL parsing exceptions in ParseException

[tgroh] Add an abstraction for State and Timers

[tgroh] DirectRunner Cleanups

[tgroh] Link up the Portable DirectRunner

[tgroh] Reuse ID Generators across Environments

[aromanenko.dev] [BEAM-4421] Fix for issue with reading s3 files using ParquetIO

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision e33da0da577b062d796dc33032214cd4846092b4 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f e33da0da577b062d796dc33032214cd4846092b4
Commit message: "Merge pull request #5530: [SQL] BeamSqlEnv refactor"
 > git rev-list --no-walk 29066a4b8fc4e0dd4b14b1373e2dc35c28e2e8e0 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8373307969922470032.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4552011880182804144.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-242>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4418684147303798074.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-242> create namespace filebasedioithdfs-242
Error from server (AlreadyExists): namespaces "filebasedioithdfs-242" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #241

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/241/display/redirect?page=changes>

Changes:

[github] Undefined names: import MetricKey, MetricName

[boyuanz] Address BEAM-4328: :beam-sdks-java-io-google-cloud-platform:test failure

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 29066a4b8fc4e0dd4b14b1373e2dc35c28e2e8e0 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 29066a4b8fc4e0dd4b14b1373e2dc35c28e2e8e0
Commit message: "[BEAM-4328]: beam-sdks-java-io-google-cloud-platform:test failure"
 > git rev-list --no-walk 684850be402ccce74ccce1d991b4fe67bf665f02 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins7483905475328172071.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins9170029964576887880.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-241>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4575515580237850082.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-241> create namespace filebasedioithdfs-241
Error from server (AlreadyExists): namespaces "filebasedioithdfs-241" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #240

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/240/display/redirect?page=changes>

Changes:

[coheigea] Fixing ErrorProne MissingOverrides warnings

[coheigea] Fixing some ErrorProne warnings

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 684850be402ccce74ccce1d991b4fe67bf665f02 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 684850be402ccce74ccce1d991b4fe67bf665f02
Commit message: "Merge pull request #5524: Fixing some ErrorProne warnings"
 > git rev-list --no-walk 175692f3f095c0b5f11f2e20577350ba0700d722 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4701520600091222206.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2487109115287234392.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-240>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3407860953718117925.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-240> create namespace filebasedioithdfs-240
Error from server (AlreadyExists): namespaces "filebasedioithdfs-240" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #239

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/239/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 175692f3f095c0b5f11f2e20577350ba0700d722 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 175692f3f095c0b5f11f2e20577350ba0700d722
Commit message: "clean generated files prior to lint"
 > git rev-list --no-walk 175692f3f095c0b5f11f2e20577350ba0700d722 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5514961195819808423.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins361381372855547784.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-239>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3322801536456607709.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-239> create namespace filebasedioithdfs-239
Error from server (AlreadyExists): namespaces "filebasedioithdfs-239" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #238

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/238/display/redirect?page=changes>

Changes:

[tgroh] Add a test to show Flatten Execution

[kedin] [SQL] Fix PubsubsJsonIT

[Pablo] clean generated files prior to lint

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 175692f3f095c0b5f11f2e20577350ba0700d722 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 175692f3f095c0b5f11f2e20577350ba0700d722
Commit message: "clean generated files prior to lint"
 > git rev-list --no-walk 1414dd56e79e73fe944e2af38db5730cd612e4e5 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6607904413435617828.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2850876453869660301.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-238>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4005827044931150821.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-238> create namespace filebasedioithdfs-238
Error from server (AlreadyExists): namespaces "filebasedioithdfs-238" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #237

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/237/display/redirect?page=changes>

Changes:

[apilloud] Run test on shadowJar on a flag

[apilloud] [SQL] Build mega jar for JDBC

[apilloud] [SQL] JDBC Class Loader IT

[coheigea] Avoid fully qualified class names, when there is an existing (static)

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 1414dd56e79e73fe944e2af38db5730cd612e4e5 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 1414dd56e79e73fe944e2af38db5730cd612e4e5
Commit message: "Merge pull request #4748: Avoid fully qualified class names, when there is an existing (static)…"
 > git rev-list --no-walk a0d1ad1c5e2aa89d458d609a83e9a0975e768ad9 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5224045474307794833.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6846373940477497526.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-237>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4228457763046619599.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-237> create namespace filebasedioithdfs-237
Error from server (AlreadyExists): namespaces "filebasedioithdfs-237" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #236

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/236/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-4323] Enforce ErrorProne analysis in sketching extensions

[iemejia] [BEAM-4356] Enforce ErrorProne analysis in nexmark

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a0d1ad1c5e2aa89d458d609a83e9a0975e768ad9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f a0d1ad1c5e2aa89d458d609a83e9a0975e768ad9
Commit message: "Merge pull request #5513 from iemejia/BEAM-4356-errorprone-nexmark"
 > git rev-list --no-walk d506e1ef8968bf97432b522bf8589db4f1be29f6 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8332475114072752159.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6433075283205592077.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-236>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins1064739421730956114.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-236> create namespace filebasedioithdfs-236
Error from server (AlreadyExists): namespaces "filebasedioithdfs-236" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #235

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/235/display/redirect?page=changes>

Changes:

[kenn] Remove Schema.TypeName.type()

[kenn] Remove unsafe builder methods from FieldType

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision d506e1ef8968bf97432b522bf8589db4f1be29f6 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f d506e1ef8968bf97432b522bf8589db4f1be29f6
Commit message: "Merge pull request #5498: [BEAM-4076] Remove unsafe methods from Schema.TypeName and Schema.FieldType"
 > git rev-list --no-walk 123c20e9ad754cc1142a0aed17edbbef637b6176 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2087248872860007196.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4117782762051566668.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-235>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4125654345617137736.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-235> create namespace filebasedioithdfs-235
Error from server (AlreadyExists): namespaces "filebasedioithdfs-235" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #234

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/234/display/redirect?page=changes>

Changes:

[szewinho] HDFS large cluster configuration. Jenkins job updated to use large

[apilloud] Run everything with shadowJar

[apilloud] guava is on kinesis public API

[apilloud] Fix direct runner shadow config

[kenn] Add more field conveniences to Schema.Builder

[kenn] Remove extraneous RowSqlTypes

[Pablo] [BEAM-4322] Enforce ErrorProne analysis in protobuf extensions project

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam8 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 123c20e9ad754cc1142a0aed17edbbef637b6176 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 123c20e9ad754cc1142a0aed17edbbef637b6176
Commit message: "Merge pull request #5441: [BEAM-3060] HDFS large cluster configuration"
 > git rev-list --no-walk 856d842e7700bd26b6292f9faf7db9ddd85a55fc # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins1457853594180533125.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5343629873352771267.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-234>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3448502326175199650.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-234> create namespace filebasedioithdfs-234
Error from server (AlreadyExists): namespaces "filebasedioithdfs-234" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #233

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/233/display/redirect?page=changes>

Changes:

[kenn] Add equals, hashCode, and structuralValue to Nexmark model objects

[cademarkegard] fix errorprone annotations and enable failOnWarning

[iemejia] Update postgres version to 42.2.2 (Java 8)

[coheigea] Update README to include virtualenv in the build instructions

[kenn] Finish RowType -> Schema rename

[kenn] Add convenience aliases for nullary type constructors

[kenn] Add MAP field support to Schema.Builder

[kenn] Add static factories for array, row, map types

[kenn] Schema fields are non-null by default

[kenn] TestUtils.RowsBuilder to use FieldType instead of TypeName

[kenn] Clean up CalciteUtils conversions a bit

[Pablo] enable failOnWarning for build-tools

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 856d842e7700bd26b6292f9faf7db9ddd85a55fc (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 856d842e7700bd26b6292f9faf7db9ddd85a55fc
Commit message: "Merge pull request #5490: Refactor schemas and fields, simplify a bit"
 > git rev-list --no-walk b5e9761e7793d824fde3de0028fea7bcde762584 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5603326570726039970.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4422046668660863954.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-233>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins7590213382329426923.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-233> create namespace filebasedioithdfs-233
Error from server (AlreadyExists): namespaces "filebasedioithdfs-233" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #232

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/232/display/redirect?page=changes>

Changes:

[timrobertson100] [BEAM-4389] Enable partial updates in ElasticsearchIO

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision b5e9761e7793d824fde3de0028fea7bcde762584 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f b5e9761e7793d824fde3de0028fea7bcde762584
Commit message: "Merge pull request #5463 from timrobertson100/BEAM-4389"
 > git rev-list --no-walk 713873f80f2417450fd5d721f2c09ae3f20e0234 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3398096643416550109.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins588372219774018666.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-232>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins799064770217748831.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-232> create namespace filebasedioithdfs-232
Error from server (AlreadyExists): namespaces "filebasedioithdfs-232" already exists
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #231

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/231/display/redirect?page=changes>

Changes:

[github] Make hash function in Coder base class more conservative.

[aaltay] Support installing Beam SDK from a wheel distribution in SDK containers.

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on beam10 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/>
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 713873f80f2417450fd5d721f2c09ae3f20e0234 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 713873f80f2417450fd5d721f2c09ae3f20e0234
Commit message: "Merge pull request #5429 from tvalentyn/patch-15"
 > git rev-list --no-walk 7b25821b0b473e075b4e410177daebfb381d4598 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins8931601299395437984.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2711364500808656563.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-231>
[beam_PerformanceTests_XmlIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins2774609381152749533.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-231> create namespace filebasedioithdfs-231
Error from server (AlreadyExists): namespaces "filebasedioithdfs-231" already exists
Build step 'Execute shell' marked build as failure
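
Complementary to the idempotent-create sketch after build #234 above, these leftover namespaces would stop accumulating if each run tore its namespace down when finished. A hedged sketch of such a cleanup step, again using the illustrative KUBECONFIG_FILE and NS names rather than anything taken from the original job:

  # Remove this build's namespace once the test completes;
  # --ignore-not-found keeps the cleanup step from failing
  # when the namespace is already gone.
  kubectl --kubeconfig="$KUBECONFIG_FILE" delete namespace "$NS" --ignore-not-found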