Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/05/03 06:05:50 UTC

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #129

See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/129/display/redirect?page=changes>

Changes:

[tgroh] Add KV and Window Coder Utilities to ModelCoders

[tgroh] Use ModelCoders in ProcessBundleDescriptors

[sidhom] Remove StateRequestHandler from JobBundleFactory

------------------------------------------
[...truncated 50.91 KB...]
> Task :beam-sdks-java-build-tools:jar
> Task :beam-sdks-java-maven-archetypes-starter:processResources
> Task :beam-sdks-java-maven-archetypes-starter:processTestResources
> Task :beam-model-pipeline:extractIncludeProto
> Task :beam-model-pipeline:extractProto
> Task :beam-sdks-java-io-elasticsearch-tests-common:shadowJar
> Task :beam-sdks-java-io-elasticsearch-tests-common:processTestResources NO-SOURCE
> Task :beam-sdks-java-io-elasticsearch-tests-common:jar
> Task :beam-sdks-java-maven-archetypes-examples:generateSources
> Task :beam-sdks-java-extensions-sql:copyFmppTemplatesFromCalciteCore
> Task :beam-sdks-java-extensions-sql:copyFmppTemplatesFromSrc
> Task :beam-model-pipeline:generateProto
> Task :beam-sdks-java-maven-archetypes-examples:processResources
> Task :beam-sdks-java-maven-archetypes-examples:processTestResources

> Task :beam-runners-apex:buildDependencyTree
See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/68578aa7/beam/runners/apex/build/classes/java/main/org/apache/beam/runners/apex/dependency-tree>

> Task :beam-runners-apex:processResources NO-SOURCE
> Task :beam-runners-apex:processTestResources
> Task :beam-sdks-java-build-tools:compileTestJava FROM-CACHE
> Task :beam-sdks-java-build-tools:processTestResources NO-SOURCE
> Task :beam-sdks-java-build-tools:testClasses UP-TO-DATE
> Task :beam-sdks-java-build-tools:packageTests
> Task :beam-sdks-java-extensions-sql:generateFmppSources
> Task :beam-sdks-java-build-tools:install

> Task :beam-sdks-python:setupVirtualenv
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/68578aa7/beam/sdks/python/build/gradleenv/bin/python2>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/68578aa7/beam/sdks/python/build/gradleenv/bin/python>
Installing setuptools, pkg_resources, pip, wheel...
> Task :beam-sdks-java-core:compileJava FROM-CACHE

> Task :beam-sdks-java-extensions-sql:compileJavacc
Java Compiler Compiler Version 6.1_2 (Parser Generator)
(type "javacc" with no arguments for help)
Reading from file <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/68578aa7/beam/sdks/java/extensions/sql/build/generated/fmpp/javacc/Parser.jj> . . .
Warning: Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1.  Set option FORCE_LA_CHECK to true to force checking.
Note: UNICODE_INPUT option is specified. Please make sure you create the parser/lexer using a Reader with the correct character encoding.
File "TokenMgrError.java" does not exist.  Will create one.
File "ParseException.java" does not exist.  Will create one.
File "Token.java" does not exist.  Will create one.
File "SimpleCharStream.java" does not exist.  Will create one.
Parser generated with 0 errors and 1 warnings.

> Task :beam-sdks-java-core:processResources
> Task :beam-sdks-java-core:classes
> Task :beam-sdks-java-extensions-sql:processResources
> Task :beam-sdks-java-extensions-sql:processTestResources NO-SOURCE

> Task :beam-sdks-python:setupVirtualenv
done.
Running virtualenv with interpreter /usr/bin/python2
Requirement already up-to-date: tox==3.0.0 in /home/jenkins/.local/lib/python2.7/site-packages (3.0.0)
Collecting grpcio-tools==1.3.5
  Using cached https://files.pythonhosted.org/packages/05/f6/0296e29b1bac6f85d2a8556d48adf825307f73109a3c2c17fb734292db0a/grpcio_tools-1.3.5-cp27-cp27mu-manylinux1_x86_64.whl
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (0.6.0)
Requirement not upgraded as not directly required: six in /usr/local/lib/python2.7/dist-packages (from tox==3.0.0) (1.11.0)
Requirement not upgraded as not directly required: virtualenv>=1.11.2 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (15.2.0)
Requirement not upgraded as not directly required: py>=1.4.17 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (1.5.3)
Collecting grpcio>=1.3.5 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl

> Task :beam-model-pipeline:compileJava
> Task :beam-model-pipeline:processResources
> Task :beam-model-pipeline:classes
> Task :beam-model-pipeline:shadowJar
> Task :beam-model-job-management:extractIncludeProto
> Task :beam-model-fn-execution:extractIncludeProto
> Task :beam-model-pipeline:jar
> Task :beam-model-job-management:generateProto
> Task :beam-model-pipeline:extractIncludeTestProto
> Task :beam-model-pipeline:extractTestProto
> Task :beam-model-pipeline:generateTestProto NO-SOURCE
> Task :beam-model-pipeline:compileTestJava NO-SOURCE
> Task :beam-model-pipeline:processTestResources NO-SOURCE
> Task :beam-model-pipeline:testClasses UP-TO-DATE
> Task :beam-model-pipeline:packageTests
> Task :beam-model-fn-execution:generateProto
> Task :beam-model-pipeline:install

> Task :beam-sdks-python:setupVirtualenv
Collecting protobuf>=3.2.0 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/9d/61/54c3a9cfde6ffe0ca6a1786ddb8874263f4ca32e7693ad383bd8cf935015/protobuf-3.5.2.post1-cp27-cp27mu-manylinux1_x86_64.whl
Requirement not upgraded as not directly required: enum34>=1.0.4 in /usr/local/lib/python2.7/dist-packages (from grpcio>=1.3.5->grpcio-tools==1.3.5) (1.1.6)
Collecting futures>=2.2.0 (from grpcio>=1.3.5->grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement not upgraded as not directly required: setuptools in /usr/local/lib/python2.7/dist-packages (from protobuf>=3.2.0->grpcio-tools==1.3.5) (39.0.1)
Installing collected packages: protobuf, futures, grpcio, grpcio-tools
Could not install packages due to an EnvironmentError: [Errno 13] Permission denied: '/usr/local/lib/python2.7/dist-packages/protobuf-3.5.2.post1-py2.7-nspkg.pth'
Consider using the `--user` option or check the permissions.
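
For reference, the EnvironmentError above comes from pip trying to write into the system site-packages. A minimal, hypothetical sketch of keeping the install inside the task's virtualenv instead (the path and package pin mirror the log but are illustrative, not the actual Gradle task):

    import subprocess

    # Hypothetical sketch: call the virtualenv's own pip so packages land under
    # build/gradleenv rather than /usr/local/lib/python2.7/dist-packages.
    VENV_PIP = "build/gradleenv/bin/pip"  # illustrative path, mirrors the log above
    subprocess.check_call([VENV_PIP, "install", "grpcio-tools==1.3.5"])

    # The per-user install suggested by the error message is the other workaround:
    # subprocess.check_call(["pip", "install", "--user", "grpcio-tools==1.3.5"])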


> Task :beam-sdks-python:setupVirtualenv FAILED
> Task :beam-model-job-management:compileJava
> Task :beam-model-fn-execution:compileJava
> Task :beam-sdks-java-core:shadowJar

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38s
130 actionable tasks: 126 executed, 2 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zlvf7src5lyhy


STDERR: 
FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/68578aa7/beam/sdks/python/build.gradle>' line: 36

* What went wrong:
Execution failed for task ':beam-sdks-python:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-python:setupVirtualenv'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.process.internal.ExecException: Process 'command 'sh'' finished with non-zero exit value 1
	at org.gradle.process.internal.DefaultExecHandle$ExecResultImpl.assertNormalExitValue(DefaultExecHandle.java:389)
	at org.gradle.process.internal.DefaultExecAction.execute(DefaultExecAction.java:36)
	at org.gradle.api.internal.file.DefaultFileOperations.exec(DefaultFileOperations.java:192)
	at org.gradle.api.internal.project.DefaultProject.exec(DefaultProject.java:1087)
	at org.gradle.groovy.scripts.DefaultScript.exec(DefaultScript.java:253)
	at org.gradle.internal.metaobject.BeanDynamicObject$MetaClassAdapter.invokeMethod(BeanDynamicObject.java:479)
	at org.gradle.internal.metaobject.BeanDynamicObject.tryInvokeMethod(BeanDynamicObject.java:191)
	at org.gradle.groovy.scripts.BasicScript$ScriptDynamicObject.tryInvokeMethod(BasicScript.java:130)
	at org.gradle.internal.metaobject.ConfigureDelegate.invokeMethod(ConfigureDelegate.java:78)
	at build_egyhdigiaz7tex8f9q59ojrjv$_run_closure2$_closure16.doCall(<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/68578aa7/beam/sdks/python/build.gradle>:36)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:732)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:705)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-05-03 06:02:46,189 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> create -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-05-03 06:02:46,455 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> create -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
2018-05-03 06:02:46,624 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2018-05-03 06:02:46,630 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:02:56,781 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:03:06,908 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:03:17,044 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:03:27,196 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:03:37,343 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:03:47,468 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:03:57,593 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:04:07,717 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:04:17,841 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:04:27,970 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:04:38,097 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:04:48,224 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:04:58,353 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:05:08,482 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:05:18,602 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:05:28,777 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:05:38,906 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 06:05:49,039 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
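
The repeated kubectl invocations above are the benchmark polling the hadoop-external service for a LoadBalancer ingress IP that never appears. A rough Python sketch of that retry pattern (function name, retry count, and interval are assumptions, not the actual PerfKitBenchmarker code):

    import subprocess
    import time

    def retrieve_load_balancer_ip(kubeconfig, retries=18, interval=10):
        # Illustrative only: poll the service until an ingress IP is assigned.
        cmd = ["kubectl", "--kubeconfig=" + kubeconfig,
               "get", "svc", "hadoop-external",
               "-ojsonpath={.status.loadBalancer.ingress[0].ip}"]
        for _ in range(retries):
            ip = subprocess.check_output(cmd).strip()
            if ip:
                return ip
            time.sleep(interval)
        return None  # no ingress IP ever appeared; the caller must handle this
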
2018-05-03 06:05:49,161 68578aa7 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
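
The TypeError above is itself a secondary bug: Python 2.7 no longer allows raising a bare string, so the intended "Could not retrieve LoadBalancer IP address" message is swallowed. A minimal sketch of the conventional fix (the real PerfKitBenchmarker change may differ):

    # Instead of:  raise "Could not retrieve LoadBalancer IP address"
    # raise a real exception type so the message propagates normally:
    raise RuntimeError("Could not retrieve LoadBalancer IP address")
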
2018-05-03 06:05:49,162 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-05-03 06:05:49,162 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml> --ignore-not-found
2018-05-03 06:05:49,528 68578aa7 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525323724735> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml> --ignore-not-found
2018-05-03 06:05:49,698 68578aa7 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 780, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
2018-05-03 06:05:49,699 68578aa7 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-03 06:05:49,750 68578aa7 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-03 06:05:49,751 68578aa7 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/68578aa7/pkb.log>
2018-05-03 06:05:49,751 68578aa7 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/68578aa7/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Jenkins build is back to normal : beam_PerformanceTests_TextIOIT_HDFS #134

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/134/display/redirect?page=changes>


Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #133

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/133/display/redirect?page=changes>

Changes:

[tgroh] Add an Unsupported StateRequestHandler

[apilloud] [SQL] Use the same javacc and fmpp as calcite

[apilloud] [SQL] BeamQueryPlanner uses BeamSqlParserImpl

------------------------------------------
[...truncated 542.81 KB...]
    Caused by: java.net.ConnectException: Call From textioit0writethenreadall-05032306-mpwd-harness-kvb2.c.apache-beam-testing.internal/10.128.0.34 to 58.252.224.35.bc.googleusercontent.com:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
    	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1479)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy65.getFileInfo(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy66.getFileInfo(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.open(HadoopFileSystem.java:114)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.open(HadoopFileSystem.java:68)
    	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:263)
    	at org.apache.beam.sdk.io.FileBasedSource$FileBasedReader.startImpl(FileBasedSource.java:451)
    	at org.apache.beam.sdk.io.OffsetBasedSource$OffsetBasedReader.start(OffsetBasedSource.java:261)
    	at org.apache.beam.sdk.io.ReadAllViaFileBasedSource$ReadFileRangesFn.process(ReadAllViaFileBasedSource.java:108)
    Caused by: java.net.ConnectException: Connection refused
    	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
    	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
    	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
    	at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
    	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy65.getFileInfo(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy66.getFileInfo(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.open(HadoopFileSystem.java:114)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.open(HadoopFileSystem.java:68)
    	at org.apache.beam.sdk.io.FileSystems.open(FileSystems.java:263)
    	at org.apache.beam.sdk.io.FileBasedSource$FileBasedReader.startImpl(FileBasedSource.java:451)
    	at org.apache.beam.sdk.io.OffsetBasedSource$OffsetBasedReader.start(OffsetBasedSource.java:261)
    	at org.apache.beam.sdk.io.ReadAllViaFileBasedSource$ReadFileRangesFn.process(ReadAllViaFileBasedSource.java:108)
    	at org.apache.beam.sdk.io.ReadAllViaFileBasedSource$ReadFileRangesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
    	at org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:129)
    	at org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
    	at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1.processElement(ReshuffleOverrideFactory.java:86)
    	at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:181)
    	at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner$1.outputWindowedValue(GroupAlsoByWindowFnRunner.java:101)
    	at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:54)
    	at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:37)
    	at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:114)
    	at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:72)
    	at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:113)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:200)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:158)
    	at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:75)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    Workflow failed. Causes: S17:Read all files/TextIO.ReadFiles/Read all via FileBasedSource/Reshuffle/Reshuffle/GroupByKey/Read+Read all files/TextIO.ReadFiles/Read all via FileBasedSource/Reshuffle/Reshuffle/GroupByKey/GroupByWindow+Read all files/TextIO.ReadFiles/Read all via FileBasedSource/Reshuffle/Reshuffle/ExpandIterable+Read all files/TextIO.ReadFiles/Read all via FileBasedSource/Reshuffle/Values/Values/Map+Read all files/TextIO.ReadFiles/Read all via FileBasedSource/Read ranges+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
      textioit0writethenreadall-05032306-mpwd-harness-9gw5,
      textioit0writethenreadall-05032306-mpwd-harness-9gw5,
      textioit0writethenreadall-05032306-mpwd-harness-9gw5,
      textioit0writethenreadall-05032306-mpwd-harness-kvb2
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
        at org.apache.beam.sdk.io.text.TextIOIT.writeThenReadAll(TextIOIT.java:114)

1 test completed, 1 failed
Finished generating test XML results (0.048 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/e08d3f61/beam/sdks/java/io/file-based-io-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.043 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/e08d3f61/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest>
:beam-sdks-java-io-file-based-io-tests:integrationTest (Thread[Task worker for ':' Thread 11,5,main]) completed. Took 10 mins 35.194 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 32s
63 actionable tasks: 24 executed, 16 from cache, 23 up-to-date

Publishing build scan...
https://gradle.com/s/hzu6262aqwnhe


STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/e08d3f61/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/e08d3f61/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>
	at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:612)
	at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:484)
	at org.gradle.api.tasks.testing.Test.executeTests(Test.java:583)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:46)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:794)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:761)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-05-04 06:17:00,925 e08d3f61 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-05-04 06:17:00,926 e08d3f61 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-05-04 06:17:00,926 e08d3f61 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525410066539> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml> --ignore-not-found
2018-05-04 06:17:01,797 e08d3f61 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525410066539> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml> --ignore-not-found
2018-05-04 06:17:01,984 e08d3f61 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 780, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-05-04 06:17:01,985 e08d3f61 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-04 06:17:02,041 e08d3f61 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-04 06:17:02,042 e08d3f61 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/e08d3f61/pkb.log>
2018-05-04 06:17:02,042 e08d3f61 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/e08d3f61/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #132

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/132/display/redirect?page=changes>

Changes:

[apilloud] [SQL] Add BeamEnumerableConverter

[mairbek] Templatize host name in SpannerIO

[coheigea] A few small fixes to catch a few regressions - put String literals

[sidhom] Move SyntheticNodes to top-level construction package and rename

[sidhom] Pull Flink execution environment utilities into static methods

[sidhom] Refactor Runner/SDK wire coder creation into a shared utility

[sidhom] [BEAM-3972] Translate portable batch pipelines by proto

[sidhom] FIXUP: Use ModelCoders to create proto coders and use WireCoders to

[sidhom] FIXUP: Use RunnerApi in WireCoders to prepare for coder instantiation

[tgroh] Add additional methods to QueryablePipeline

[tgroh] Expand the ExecutableGraph interface

[tgroh] Add ProtoOverrides

[jasonkuster] Add labels to Go SDK Dataflow Runner.

[jasonkuster] Add verbose error message if unmarshaling fails for label flag.

------------------------------------------
[...truncated 404.28 KB...]
    	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
    	at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
    	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy65.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy66.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1648)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1689)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1624)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
    	at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:138)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:118)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
    	at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeterministicallyConstructTestTextLineFn.processElement(FileBasedIOITHelper.java:77)
    	at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeterministicallyConstructTestTextLineFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:200)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:158)
    	at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:75)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create file/.temp-beam-2018-05-04_00-06-07-0/20507945-07f0-4114-bb69-5e4b7a9d651d. Name node is in safe mode.
    The reported blocks 0 needs additional 31 blocks to reach the threshold 0.9990 of total blocks 31.
    The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1327)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2447)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2335)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:623)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:397)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

    	at org.apache.hadoop.ipc.Client.call(Client.java:1475)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy65.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy66.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1648)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1689)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1624)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
    	at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
    Workflow failed. Causes: S02:Generate sequence/Read(BoundedCountingSource)+Produce text lines+Write content to files/WriteFiles/RewindowIntoGlobal/Window.Assign+Write content to files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write content to files/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+Write content to files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write content to files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
      textioit0writethenreadall-05031706-3t1c-harness-b3nm,
      textioit0writethenreadall-05031706-3t1c-harness-b3nm,
      textioit0writethenreadall-05031706-3t1c-harness-b3nm,
      textioit0writethenreadall-05031706-3t1c-harness-b3nm
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
        at org.apache.beam.sdk.io.text.TextIOIT.writeThenReadAll(TextIOIT.java:114)

1 test completed, 1 failed
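
The root cause of this failure is the SafeModeException reported by the NameNode above: with 0 live datanodes and 0 of the 31 required blocks reported, HDFS keeps safe mode on and rejects the create() call that FileBasedSink issues for its temp files, so every retry of the write work item fails the same way until the worker loses contact with the service. A minimal diagnostic sketch (Python), assuming the hdfs CLI is reachable inside the NameNode pod via kubectl exec; the pod name below is hypothetical, while "hdfs dfsadmin -safemode get/wait" are standard Hadoop commands:

    import subprocess

    NAMENODE_POD = "hadoop-namenode-0"  # hypothetical pod name in the SmallITCluster

    def hdfs_safemode(action):
        # action is "get" (inspect current state) or "wait" (block until safe mode is OFF)
        return subprocess.check_output(
            ["kubectl", "exec", NAMENODE_POD, "--",
             "hdfs", "dfsadmin", "-safemode", action])

    print(hdfs_safemode("get"))   # reports "Safe mode is ON" while no datanodes have checked in
    hdfs_safemode("wait")         # returns once the block/datanode thresholds are met
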
Finished generating test XML results (0.032 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/f6f51fd9/beam/sdks/java/io/file-based-io-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.038 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/f6f51fd9/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest>
:beam-sdks-java-io-file-based-io-tests:integrationTest (Thread[Task worker for ':' Thread 14,5,main]) completed. Took 10 mins 58.792 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 12m 16s
63 actionable tasks: 24 executed, 16 from cache, 23 up-to-date

Publishing build scan...
https://gradle.com/s/bqhshu7vi2eus


STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/f6f51fd9/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/f6f51fd9/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>
	at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:612)
	at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:484)
	at org.gradle.api.tasks.testing.Test.executeTests(Test.java:583)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:46)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:794)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:761)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-05-04 00:17:03,609 f6f51fd9 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
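
The AssertionError itself is generic by design: judging from the frame "gcp_dpb_dataflow.py, line 90, in SubmitJob", PerfKitBenchmarker runs the Gradle integrationTest command and only asserts on its exit code, so the Dataflow/HDFS details above are where the real failure lives, not in this message. A rough sketch of that pattern, with a hypothetical command runner standing in for PerfKit's own helper:

    def SubmitJob(run_gradle_command):
        # run_gradle_command is a hypothetical callable that executes the
        # ./gradlew integrationTest invocation and returns its exit code.
        retcode = run_gradle_command()
        # Any non-zero Gradle exit status is surfaced as this generic assertion.
        assert retcode == 0, "Integration Test Failed."
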
2018-05-04 00:17:03,611 f6f51fd9 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-05-04 00:17:03,611 f6f51fd9 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525390316474> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml> --ignore-not-found
2018-05-04 00:17:04,485 f6f51fd9 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525390316474> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml> --ignore-not-found
2018-05-04 00:17:04,664 f6f51fd9 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 780, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-05-04 00:17:04,664 f6f51fd9 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-04 00:17:04,693 f6f51fd9 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-04 00:17:04,693 f6f51fd9 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/f6f51fd9/pkb.log>
2018-05-04 00:17:04,694 f6f51fd9 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/f6f51fd9/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #131

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/131/display/redirect>

------------------------------------------
[...truncated 51.16 KB...]
> Task :beam-sdks-java-io-elasticsearch-tests-common:shadowJar
> Task :beam-sdks-java-io-elasticsearch-tests-common:processTestResources NO-SOURCE
> Task :beam-sdks-java-io-elasticsearch-tests-common:jar
> Task :beam-sdks-java-extensions-sql:copyFmppTemplatesFromSrc
> Task :beam-sdks-java-maven-archetypes-examples:generateSources
> Task :beam-sdks-java-maven-archetypes-examples:processResources
> Task :beam-sdks-java-maven-archetypes-examples:processTestResources
> Task :beam-model-pipeline:extractIncludeProto
> Task :beam-model-pipeline:extractProto

> Task :beam-sdks-python:setupVirtualenv
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/5ac15731/beam/sdks/python/build/gradleenv/bin/python2>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/5ac15731/beam/sdks/python/build/gradleenv/bin/python>
Installing setuptools, pkg_resources, pip, wheel...
> Task :beam-model-pipeline:generateProto

> Task :beam-runners-apex:buildDependencyTree
See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/5ac15731/beam/runners/apex/build/classes/java/main/org/apache/beam/runners/apex/dependency-tree>

> Task :beam-runners-apex:processResources NO-SOURCE
> Task :beam-runners-apex:processTestResources
> Task :beam-sdks-java-extensions-sql:generateFmppSources
> Task :beam-sdks-java-build-tools:compileTestJava FROM-CACHE
> Task :beam-sdks-java-build-tools:processTestResources NO-SOURCE
> Task :beam-sdks-java-build-tools:testClasses UP-TO-DATE
> Task :beam-sdks-java-build-tools:packageTests
> Task :beam-model-pipeline:compileJava FROM-CACHE
> Task :beam-model-pipeline:processResources
> Task :beam-model-pipeline:classes
> Task :beam-sdks-java-build-tools:install

> Task :beam-sdks-python:setupVirtualenv
done.
Running virtualenv with interpreter /usr/bin/python2

> Task :beam-sdks-java-core:compileJava FROM-CACHE

> Task :beam-sdks-java-extensions-sql:compileJavacc
Java Compiler Compiler Version 6.1_2 (Parser Generator)
(type "javacc" with no arguments for help)
Reading from file <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/5ac15731/beam/sdks/java/extensions/sql/build/generated/fmpp/javacc/Parser.jj> . . .
Note: UNICODE_INPUT option is specified. Please make sure you create the parser/lexer using a Reader with the correct character encoding.
Warning: Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1.  Set option FORCE_LA_CHECK to true to force checking.
File "TokenMgrError.java" does not exist.  Will create one.
File "ParseException.java" does not exist.  Will create one.
File "Token.java" does not exist.  Will create one.
File "SimpleCharStream.java" does not exist.  Will create one.
Parser generated with 0 errors and 1 warnings.

> Task :beam-sdks-java-core:processResources
> Task :beam-sdks-java-core:classes
> Task :beam-model-pipeline:shadowJar
> Task :beam-model-job-management:extractIncludeProto
> Task :beam-model-fn-execution:extractIncludeProto
> Task :beam-model-pipeline:jar
> Task :beam-sdks-java-extensions-sql:processResources
> Task :beam-sdks-java-extensions-sql:processTestResources NO-SOURCE
> Task :beam-model-job-management:generateProto
> Task :beam-model-fn-execution:generateProto
> Task :beam-model-pipeline:extractIncludeTestProto
> Task :beam-model-pipeline:extractTestProto
> Task :beam-model-pipeline:generateTestProto NO-SOURCE
> Task :beam-model-pipeline:compileTestJava NO-SOURCE
> Task :beam-model-pipeline:processTestResources NO-SOURCE
> Task :beam-model-pipeline:testClasses UP-TO-DATE
> Task :beam-model-pipeline:packageTests
> Task :beam-model-pipeline:install
> Task :beam-model-job-management:compileJava FROM-CACHE
> Task :beam-model-job-management:classes
> Task :beam-model-fn-execution:compileJava FROM-CACHE
> Task :beam-model-fn-execution:classes

> Task :beam-sdks-python:setupVirtualenv
Requirement already up-to-date: tox==3.0.0 in /home/jenkins/.local/lib/python2.7/site-packages (3.0.0)
Collecting grpcio-tools==1.3.5
  Using cached https://files.pythonhosted.org/packages/05/f6/0296e29b1bac6f85d2a8556d48adf825307f73109a3c2c17fb734292db0a/grpcio_tools-1.3.5-cp27-cp27mu-manylinux1_x86_64.whl
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (0.6.0)
Requirement not upgraded as not directly required: six in /usr/local/lib/python2.7/dist-packages (from tox==3.0.0) (1.11.0)
Requirement not upgraded as not directly required: virtualenv>=1.11.2 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (15.2.0)
Requirement not upgraded as not directly required: py>=1.4.17 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (1.5.3)
Collecting grpcio>=1.3.5 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting protobuf>=3.2.0 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/9d/61/54c3a9cfde6ffe0ca6a1786ddb8874263f4ca32e7693ad383bd8cf935015/protobuf-3.5.2.post1-cp27-cp27mu-manylinux1_x86_64.whl
Requirement not upgraded as not directly required: enum34>=1.0.4 in /usr/local/lib/python2.7/dist-packages (from grpcio>=1.3.5->grpcio-tools==1.3.5) (1.1.6)
Collecting futures>=2.2.0 (from grpcio>=1.3.5->grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement not upgraded as not directly required: setuptools in /usr/local/lib/python2.7/dist-packages (from protobuf>=3.2.0->grpcio-tools==1.3.5) (39.0.1)
Installing collected packages: protobuf, futures, grpcio, grpcio-tools
Could not install packages due to an EnvironmentError: [Errno 13] Permission denied: '/usr/local/lib/python2.7/dist-packages/protobuf-3.5.2.post1-py2.7-nspkg.pth'
Consider using the `--user` option or check the permissions.


> Task :beam-sdks-python:setupVirtualenv FAILED
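
The setupVirtualenv failure above is a permissions problem rather than a dependency problem: the "Requirement not upgraded" lines show pip resolving against /usr/local/lib/python2.7/dist-packages, and the install of protobuf/grpcio then tries to write there and hits Errno 13 as the unprivileged jenkins user. A minimal sketch of the two non-root alternatives pip hints at, assuming the gradleenv layout printed earlier in this log (the exact wiring in build.gradle line 36 is not shown here); the package versions are the ones from the log:

    import subprocess

    # pip belonging to the virtualenv this task created (path relative to
    # sdks/python); it installs into the env instead of system dist-packages.
    VENV_PIP = "build/gradleenv/bin/pip"

    subprocess.check_call([VENV_PIP, "install", "tox==3.0.0", "grpcio-tools==1.3.5"])

    # Outside a virtualenv, pip's own suggestion is the per-user scheme:
    # subprocess.check_call(["pip", "install", "--user", "grpcio-tools==1.3.5"])
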
> Task :beam-model-job-management:shadowJar
> Task :beam-model-fn-execution:shadowJar
> Task :beam-sdks-java-core:shadowJar

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52s
132 actionable tasks: 125 executed, 5 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/joxat5fwnatzc


STDERR: 
FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/5ac15731/beam/sdks/python/build.gradle>' line: 36

* What went wrong:
Execution failed for task ':beam-sdks-python:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-python:setupVirtualenv'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.process.internal.ExecException: Process 'command 'sh'' finished with non-zero exit value 1
	at org.gradle.process.internal.DefaultExecHandle$ExecResultImpl.assertNormalExitValue(DefaultExecHandle.java:389)
	at org.gradle.process.internal.DefaultExecAction.execute(DefaultExecAction.java:36)
	at org.gradle.api.internal.file.DefaultFileOperations.exec(DefaultFileOperations.java:192)
	at org.gradle.api.internal.project.DefaultProject.exec(DefaultProject.java:1087)
	at org.gradle.groovy.scripts.DefaultScript.exec(DefaultScript.java:253)
	at org.gradle.internal.metaobject.BeanDynamicObject$MetaClassAdapter.invokeMethod(BeanDynamicObject.java:479)
	at org.gradle.internal.metaobject.BeanDynamicObject.tryInvokeMethod(BeanDynamicObject.java:191)
	at org.gradle.groovy.scripts.BasicScript$ScriptDynamicObject.tryInvokeMethod(BasicScript.java:130)
	at org.gradle.internal.metaobject.ConfigureDelegate.invokeMethod(ConfigureDelegate.java:78)
	at build_7bhez0sy1fnjhqis3z58se9uy$_run_closure2$_closure16.doCall(<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/5ac15731/beam/sdks/python/build.gradle>:36)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:732)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:705)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-05-03 18:02:02,872 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> create -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-05-03 18:02:03,234 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> create -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
2018-05-03 18:02:03,524 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2018-05-03 18:02:03,533 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:02:13,748 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:02:23,943 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:02:34,160 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:02:44,363 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:02:54,505 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:03:04,683 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:03:14,887 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:03:25,119 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:03:35,323 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:03:45,511 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:03:55,695 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:04:05,864 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:04:15,987 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:04:26,122 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:04:36,319 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:04:46,470 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:04:56,647 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:05:06,773 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-03 18:05:06,897 5ac15731 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
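
Two failures stack here: the hadoop-external service never exposed a LoadBalancer ingress IP during the kubectl polls above, and when the lookup finally gave up, `raise` was applied to a plain string, which Python 2.7 rejects with this TypeError and so hides the intended message. A minimal sketch of the fix, assuming the function shape implied by the frame at beam_pipeline_options.py line 153; the polling helper below is hypothetical:

    def RetrieveLoadBalancerIp(option_descriptor):
        ip = _poll_kubectl_for_ingress_ip(option_descriptor)  # hypothetical helper
        if not ip:
            # Was: raise "Could not retrieve LoadBalancer IP address"
            # Raising a plain str is itself a TypeError in Python 2.7; raising a
            # real exception class lets the intended message reach the log.
            raise RuntimeError("Could not retrieve LoadBalancer IP address")
        return ip
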
2018-05-03 18:05:06,898 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-05-03 18:05:06,898 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml> --ignore-not-found
2018-05-03 18:05:08,512 5ac15731 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525361334656> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml> --ignore-not-found
2018-05-03 18:05:08,688 5ac15731 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 780, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py>", line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
2018-05-03 18:05:08,689 5ac15731 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-03 18:05:08,742 5ac15731 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-03 18:05:08,743 5ac15731 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/5ac15731/pkb.log>
2018-05-03 18:05:08,743 5ac15731 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/5ac15731/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #130

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/130/display/redirect?page=changes>

Changes:

[echauchot] [BEAM-4138] Support runners that do not support committed metrics in

------------------------------------------
[...truncated 635.43 KB...]
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
    	at org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:129)
    	at org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
    	at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1.processElement(ReshuffleOverrideFactory.java:86)
    	at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:181)
    	at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner$1.outputWindowedValue(GroupAlsoByWindowFnRunner.java:101)
    	at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:54)
    	at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:37)
    	at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:114)
    	at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:72)
    	at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:113)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:200)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:158)
    	at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:75)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    Workflow failed. Causes: S17:Read all files/TextIO.ReadFiles/Read all via FileBasedSource/Reshuffle/Reshuffle/GroupByKey/Read+Read all files/TextIO.ReadFiles/Read all via FileBasedSource/Reshuffle/Reshuffle/GroupByKey/GroupByWindow+Read all files/TextIO.ReadFiles/Read all via FileBasedSource/Reshuffle/Reshuffle/ExpandIterable+Read all files/TextIO.ReadFiles/Read all via FileBasedSource/Reshuffle/Values/Values/Map+Read all files/TextIO.ReadFiles/Read all via FileBasedSource/Read ranges+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
      textioit0writethenreadall-05030506-ymdo-harness-zxxg,
      textioit0writethenreadall-05030506-ymdo-harness-wlg0,
      textioit0writethenreadall-05030506-ymdo-harness-zxxg,
      textioit0writethenreadall-05030506-ymdo-harness-42nk
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
        at org.apache.beam.sdk.io.text.TextIOIT.writeThenReadAll(TextIOIT.java:114)

1 test completed, 1 failed
Finished generating test XML results (0.062 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/66f42c07/beam/sdks/java/io/file-based-io-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.055 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/66f42c07/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest>
:beam-sdks-java-io-file-based-io-tests:integrationTest (Thread[main,5,main]) completed. Took 10 mins 50.847 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 56s
63 actionable tasks: 24 executed, 16 from cache, 23 up-to-date

Publishing build scan...
https://gradle.com/s/beerefox45smo


STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/66f42c07/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor.process(DefaultTaskPlanExecutor.java:59)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter.execute(DefaultTaskGraphExecuter.java:130)
	at org.gradle.execution.SelectedTaskExecutionAction.execute(SelectedTaskExecutionAction.java:37)
	at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:37)
	at org.gradle.execution.DefaultBuildExecuter.access$000(DefaultBuildExecuter.java:23)
	at org.gradle.execution.DefaultBuildExecuter$1.proceed(DefaultBuildExecuter.java:43)
	at org.gradle.execution.DryRunBuildExecutionAction.execute(DryRunBuildExecutionAction.java:46)
	at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:37)
	at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:30)
	at org.gradle.initialization.DefaultGradleLauncher$ExecuteTasks.run(DefaultGradleLauncher.java:336)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.initialization.DefaultGradleLauncher.runTasks(DefaultGradleLauncher.java:210)
	at org.gradle.initialization.DefaultGradleLauncher.doBuildStages(DefaultGradleLauncher.java:140)
	at org.gradle.initialization.DefaultGradleLauncher.executeTasks(DefaultGradleLauncher.java:115)
	at org.gradle.internal.invocation.GradleBuildController$1.call(GradleBuildController.java:78)
	at org.gradle.internal.invocation.GradleBuildController$1.call(GradleBuildController.java:75)
	at org.gradle.internal.work.DefaultWorkerLeaseService.withLocks(DefaultWorkerLeaseService.java:152)
	at org.gradle.internal.work.StopShieldingWorkerLeaseService.withLocks(StopShieldingWorkerLeaseService.java:38)
	at org.gradle.internal.invocation.GradleBuildController.doBuild(GradleBuildController.java:100)
	at org.gradle.internal.invocation.GradleBuildController.run(GradleBuildController.java:75)
	at org.gradle.tooling.internal.provider.ExecuteBuildActionRunner.run(ExecuteBuildActionRunner.java:28)
	at org.gradle.launcher.exec.ChainingBuildActionRunner.run(ChainingBuildActionRunner.java:35)
	at org.gradle.tooling.internal.provider.ValidatingBuildActionRunner.run(ValidatingBuildActionRunner.java:32)
	at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner$3.run(RunAsBuildOperationBuildActionRunner.java:45)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner.run(RunAsBuildOperationBuildActionRunner.java:42)
	at org.gradle.tooling.internal.provider.SubscribableBuildActionRunner.run(SubscribableBuildActionRunner.java:51)
	at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:47)
	at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:31)
	at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:39)
	at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:25)
	at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:80)
	at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:53)
	at org.gradle.tooling.internal.provider.ServicesSetupBuildActionExecuter.execute(ServicesSetupBuildActionExecuter.java:61)
	at org.gradle.tooling.internal.provider.ServicesSetupBuildActionExecuter.execute(ServicesSetupBuildActionExecuter.java:34)
	at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:36)
	at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:25)
	at org.gradle.tooling.internal.provider.ParallelismConfigurationBuildActionExecuter.execute(ParallelismConfigurationBuildActionExecuter.java:43)
	at org.gradle.tooling.internal.provider.ParallelismConfigurationBuildActionExecuter.execute(ParallelismConfigurationBuildActionExecuter.java:29)
	at org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:64)
	at org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:29)
	at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:59)
	at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:44)
	at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:46)
	at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:30)
	at org.gradle.launcher.cli.RunBuildAction.run(RunBuildAction.java:52)
	at org.gradle.internal.Actions$RunnableActionAdapter.execute(Actions.java:173)
	at org.gradle.launcher.cli.CommandLineActionFactory$ParseAndBuildAction.execute(CommandLineActionFactory.java:379)
	at org.gradle.launcher.cli.CommandLineActionFactory$ParseAndBuildAction.execute(CommandLineActionFactory.java:352)
	at org.gradle.launcher.cli.ExceptionReportingAction.execute(ExceptionReportingAction.java:37)
	at org.gradle.launcher.cli.ExceptionReportingAction.execute(ExceptionReportingAction.java:23)
	at org.gradle.launcher.cli.CommandLineActionFactory$WithLogging.execute(CommandLineActionFactory.java:345)
	at org.gradle.launcher.cli.CommandLineActionFactory$WithLogging.execute(CommandLineActionFactory.java:275)
	at org.gradle.launcher.Main.doAction(Main.java:36)
	at org.gradle.launcher.bootstrap.EntryPoint.run(EntryPoint.java:45)
	at org.gradle.launcher.bootstrap.ProcessBootstrap.runNoExit(ProcessBootstrap.java:60)
	at org.gradle.launcher.bootstrap.ProcessBootstrap.run(ProcessBootstrap.java:37)
	at org.gradle.launcher.GradleMain.main(GradleMain.java:23)
	at org.gradle.wrapper.BootstrapMainStarter.start(BootstrapMainStarter.java:31)
	at org.gradle.wrapper.WrapperExecutor.execute(WrapperExecutor.java:108)
	at org.gradle.wrapper.GradleWrapperMain.main(GradleWrapperMain.java:61)
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/66f42c07/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>
	at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:612)
	at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:484)
	at org.gradle.api.tasks.testing.Test.executeTests(Test.java:583)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:46)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:794)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:761)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 97 more


* Get more help at https://help.gradle.org
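The report URL above points at the per-test HTML results; when that is not enough, the "* Try:" hint suggests re-running the failing task with --debug and --scan. Below is a minimal, hypothetical Python sketch of such a re-run from a Beam checkout; the task path and flags are taken from the log above, but the helper itself is illustrative and not part of PerfKitBenchmarker or the Jenkins job.

import subprocess

def rerun_integration_test_with_diagnostics(beam_root):
    # Re-run the failing task with the extra logging Gradle suggests above.
    cmd = [
        "./gradlew",
        ":beam-sdks-java-io-file-based-io-tests:integrationTest",
        "--debug",  # more log output
        "--scan",   # full build-scan insights
    ]
    return subprocess.run(cmd, cwd=beam_root, check=False).returncode

# Example (path is a placeholder):
# rerun_integration_test_with_diagnostics("/path/to/beam")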

2018-05-03 12:17:29,648 66f42c07 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
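The traceback shows where the failure is detected: the SubmitJob frame in gcp_dpb_dataflow.py asserts that the return code of the submitted job is zero, so any nonzero Gradle exit surfaces as "AssertionError: Integration Test Failed.". A minimal sketch of that pattern follows; the function name and argument are illustrative, not the real SubmitJob signature.

import subprocess

def run_and_check(cmd):
    # Shell out to the test runner and turn a nonzero exit into an assertion,
    # mirroring the `assert retcode == 0` check visible in the traceback above.
    retcode = subprocess.call(cmd)
    assert retcode == 0, "Integration Test Failed."

# Example (command is a placeholder):
# run_and_check(["./gradlew", ":beam-sdks-java-io-file-based-io-tests:integrationTest"])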
2018-05-03 12:17:29,649 66f42c07 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-05-03 12:17:29,650 66f42c07 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525341661900> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml> --ignore-not-found
2018-05-03 12:17:30,839 66f42c07 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525341661900> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml> --ignore-not-found
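The two kubectl commands above tear down the single-datanode HDFS cluster used by the test; --ignore-not-found makes the cleanup safe to run even when a manifest was never applied. A hypothetical sketch of the same teardown step is shown below; names and paths are placeholders, and this is not the actual PerfKitBenchmarker cleanup code.

import subprocess

def delete_k8s_manifest(kubeconfig, manifest):
    # Delete the resources described by `manifest`, ignoring ones that
    # no longer exist, as in the cleanup commands logged above.
    subprocess.run(
        ["kubectl", "--kubeconfig=" + kubeconfig, "delete", "-f", manifest,
         "--ignore-not-found"],
        check=True)

# Example (paths are placeholders):
# delete_k8s_manifest("config-filebasedioithdfs-1525341661900",
#                     ".test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml")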
2018-05-03 12:17:31,040 66f42c07 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 780, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-05-03 12:17:31,041 66f42c07 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-03 12:17:31,074 66f42c07 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-03 12:17:31,075 66f42c07 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/66f42c07/pkb.log>
2018-05-03 12:17:31,076 66f42c07 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/66f42c07/completion_statuses.json>
Build step 'Execute shell' marked build as failure