Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/05/09 06:05:08 UTC

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #153

See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/153/display/redirect?page=changes>

Changes:

[Pablo] Make experiments as set attr of RuntimeValueProvider

------------------------------------------
[...truncated 52.99 KB...]
> Task :beam-sdks-java-extensions-sql:compileJavacc
Java Compiler Compiler Version 4.0 (Parser Generator)
(type "javacc" with no arguments for help)
Warning: Bad option "-grammar_encoding=UTF-8" will be ignored.
Reading from file <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/73676819/beam/sdks/java/extensions/sql/build/generated/fmpp/javacc/Parser.jj> . . .
Warning: Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1.  Set option FORCE_LA_CHECK to true to force checking.
File "TokenMgrError.java" does not exist.  Will create one.
File "ParseException.java" does not exist.  Will create one.
File "Token.java" does not exist.  Will create one.
File "SimpleCharStream.java" does not exist.  Will create one.
Parser generated with 0 errors and 1 warnings.

> Task :beam-sdks-java-core:compileJava FROM-CACHE
> Task :beam-sdks-java-core:processResources
> Task :beam-sdks-java-core:classes

> Task :beam-sdks-python:setupVirtualenv
done.
Running virtualenv with interpreter /usr/bin/python2
Requirement already up-to-date: tox==3.0.0 in /home/jenkins/.local/lib/python2.7/site-packages (3.0.0)
Collecting grpcio-tools==1.3.5

> Task :beam-sdks-java-extensions-sql:processResources
> Task :beam-sdks-java-extensions-sql:processTestResources NO-SOURCE
> Task :beam-model-pipeline:shadowJar
> Task :beam-model-job-management:extractIncludeProto
> Task :beam-model-fn-execution:extractIncludeProto
> Task :beam-model-pipeline:jar
> Task :beam-model-job-management:generateProto
> Task :beam-model-pipeline:extractIncludeTestProto
> Task :beam-model-pipeline:extractTestProto
> Task :beam-model-pipeline:generateTestProto NO-SOURCE
> Task :beam-model-pipeline:compileTestJava NO-SOURCE
> Task :beam-model-pipeline:processTestResources NO-SOURCE
> Task :beam-model-pipeline:testClasses UP-TO-DATE
> Task :beam-model-pipeline:packageTests
> Task :beam-model-fn-execution:generateProto
> Task :beam-model-pipeline:install
> Task :beam-model-job-management:compileJava FROM-CACHE
> Task :beam-model-job-management:classes
> Task :beam-model-fn-execution:compileJava FROM-CACHE
> Task :beam-model-fn-execution:classes

> Task :beam-sdks-python:setupVirtualenv
  Using cached https://files.pythonhosted.org/packages/05/f6/0296e29b1bac6f85d2a8556d48adf825307f73109a3c2c17fb734292db0a/grpcio_tools-1.3.5-cp27-cp27mu-manylinux1_x86_64.whl
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (0.6.0)
Requirement not upgraded as not directly required: six in /usr/local/lib/python2.7/dist-packages (from tox==3.0.0) (1.11.0)
Requirement not upgraded as not directly required: virtualenv>=1.11.2 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (15.2.0)
Requirement not upgraded as not directly required: py>=1.4.17 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (1.5.3)
Collecting grpcio>=1.3.5 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting protobuf>=3.2.0 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/9d/61/54c3a9cfde6ffe0ca6a1786ddb8874263f4ca32e7693ad383bd8cf935015/protobuf-3.5.2.post1-cp27-cp27mu-manylinux1_x86_64.whl
Requirement not upgraded as not directly required: enum34>=1.0.4 in /usr/local/lib/python2.7/dist-packages (from grpcio>=1.3.5->grpcio-tools==1.3.5) (1.1.6)
Collecting futures>=2.2.0 (from grpcio>=1.3.5->grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement not upgraded as not directly required: setuptools in /usr/local/lib/python2.7/dist-packages (from protobuf>=3.2.0->grpcio-tools==1.3.5) (39.0.1)
Installing collected packages: protobuf, futures, grpcio, grpcio-tools
Could not install packages due to an EnvironmentError: [Errno 13] Permission denied: '/usr/local/lib/python2.7/dist-packages/protobuf-3.5.2.post1-py2.7-nspkg.pth'
Consider using the `--user` option or check the permissions.
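
The EnvironmentError above is pip escaping the task's virtualenv and trying to write a root-owned .pth file under /usr/local/lib/python2.7/dist-packages. A minimal sketch of the workaround pip itself suggests (the helper name is illustrative, not part of the actual setupVirtualenv task):

    # Minimal sketch, assuming the goal is simply to keep pip out of the
    # system site-packages. "--user" (as pip suggests) installs into
    # ~/.local instead of the root-owned /usr/local path.
    import subprocess
    import sys

    def pip_install_user(packages):
        # sys.executable pins the install to the interpreter in use,
        # mirroring "Running virtualenv with interpreter /usr/bin/python2".
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "--user"] + list(packages))

    pip_install_user(["grpcio-tools==1.3.5"])

Inside a correctly activated virtualenv the --user flag would be unnecessary; that pip reached /usr/local at all suggests the virtualenv's pip was not the one actually invoked.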


> Task :beam-sdks-python:setupVirtualenv FAILED
> Task :beam-model-job-management:shadowJar
> Task :beam-model-fn-execution:shadowJar
> Task :beam-sdks-java-core:shadowJar

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 0s
132 actionable tasks: 125 executed, 5 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xv5y4nvkqh5rw


STDERR: 
FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/73676819/beam/sdks/python/build.gradle>' line: 36

* What went wrong:
Execution failed for task ':beam-sdks-python:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-python:setupVirtualenv'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.process.internal.ExecException: Process 'command 'sh'' finished with non-zero exit value 1
	at org.gradle.process.internal.DefaultExecHandle$ExecResultImpl.assertNormalExitValue(DefaultExecHandle.java:389)
	at org.gradle.process.internal.DefaultExecAction.execute(DefaultExecAction.java:36)
	at org.gradle.api.internal.file.DefaultFileOperations.exec(DefaultFileOperations.java:192)
	at org.gradle.api.internal.project.DefaultProject.exec(DefaultProject.java:1087)
	at org.gradle.groovy.scripts.DefaultScript.exec(DefaultScript.java:253)
	at org.gradle.internal.metaobject.BeanDynamicObject$MetaClassAdapter.invokeMethod(BeanDynamicObject.java:479)
	at org.gradle.internal.metaobject.BeanDynamicObject.tryInvokeMethod(BeanDynamicObject.java:191)
	at org.gradle.groovy.scripts.BasicScript$ScriptDynamicObject.tryInvokeMethod(BasicScript.java:130)
	at org.gradle.internal.metaobject.ConfigureDelegate.invokeMethod(ConfigureDelegate.java:78)
	at build_e3ml1laslmwyn59g9eq7a6ifk$_run_closure2$_closure16.doCall(<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/73676819/beam/sdks/python/build.gradle>:36)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:732)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:705)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-05-09 06:02:02,925 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> create -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-05-09 06:02:03,245 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> create -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
2018-05-09 06:02:03,491 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2018-05-09 06:02:03,497 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:02:13,708 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:02:23,924 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:02:34,136 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:02:44,304 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:02:54,489 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:03:04,677 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:03:14,874 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:03:25,035 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:03:35,224 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:03:45,444 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:03:55,648 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:04:05,850 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:04:16,004 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:04:26,136 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:04:36,740 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:04:46,907 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:04:57,055 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-09 06:05:07,224 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
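
The block of kubectl calls above is the benchmark polling, at roughly ten-second intervals, for the external IP that the hadoop-external LoadBalancer service never receives; after about three minutes RetrieveLoadBalancerIp gives up, as the traceback below shows. An illustrative reconstruction of that loop (not the actual beam_pipeline_options.py code; the function name comes from the traceback, and the retry count and delay are inferred from the timestamps):

    # Illustrative reconstruction of the polling seen above -- only the
    # kubectl invocation is taken verbatim from the log; retry parameters
    # are assumptions.
    import subprocess
    import time

    JSONPATH = "{.status.loadBalancer.ingress[0].ip}"

    def retrieve_load_balancer_ip(kubeconfig, service="hadoop-external",
                                  retries=18, delay=10):
        for _ in range(retries):
            ip = subprocess.check_output(
                ["kubectl", "--kubeconfig", kubeconfig, "get", "svc", service,
                 "-ojsonpath=" + JSONPATH]).decode().strip()
            if ip:
                return ip
            time.sleep(delay)  # the log shows ~10 s between attempts
        raise RuntimeError("Could not retrieve LoadBalancer IP address")

The real code raised a bare string at this point rather than an exception object, which is what produces the TypeError below.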
2018-05-09 06:05:07,361 73676819 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 647, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
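
The TypeError masks the real failure: beam_pipeline_options.py line 153 executes raise with a plain string, and string exceptions have been rejected by the interpreter since Python 2.6. The intended message survives in the traceback, so the one-line fix is to wrap it in an exception class:

    # What line 153 does (rejected by the interpreter):
    #     raise "Could not retrieve LoadBalancer IP address"
    # Any BaseException subclass keeps the intended message:
    raise RuntimeError("Could not retrieve LoadBalancer IP address")

Until that is fixed, every LoadBalancer timeout will be reported as this TypeError instead of the real "could not retrieve IP" error.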
2018-05-09 06:05:07,362 73676819 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-05-09 06:05:07,362 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml> --ignore-not-found
2018-05-09 06:05:07,776 73676819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525842068084> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml> --ignore-not-found
2018-05-09 06:05:07,932 73676819 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 781, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 647, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
2018-05-09 06:05:07,933 73676819 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-09 06:05:07,961 73676819 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-09 06:05:07,961 73676819 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/73676819/pkb.log>
2018-05-09 06:05:07,962 73676819 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/73676819/completion_statuses.json>
Build step 'Execute shell' marked build as failure
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@278251d3[description=Apache Beam,homepage=,name=beam,fork=false,size=58754,milestones={},language=Java,commits={},source=<null>,parent=<null>,responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/json; charset=utf-8], Date=[Wed, 09 May 2018 06:05:08 GMT], ETag=[W/"b325fcee1bcdcb6673ffb3a642ac3060"], Last-Modified=[Wed, 09 May 2018 00:26:13 GMT], OkHttp-Received-Millis=[1525845908694], OkHttp-Response-Source=[NETWORK 200], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1525845908548], Referrer-Policy=[origin-when-cross-origin, strict-origin-when-cross-origin], Server=[GitHub.com], Status=[200 OK], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[DE20:4F40:9893BA:14F4C29:5AF28F89], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4971], X-RateLimit-Reset=[1525848688], X-Runtime-rack=[0.063063], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/apache/beam,id=50904245]] (sha:60f90c8) with context:beam_PerformanceTests_TextIOIT_HDFS
Setting commit status on GitHub for https://github.com/apache/beam/commit/60f90c8dcb229c35a82c7be15e64a89678bae058
ERROR: Build step failed with exception
java.io.FileNotFoundException: https://api.github.com/repos/apache/beam/statuses/60f90c8dcb229c35a82c7be15e64a89678bae058
	at com.squareup.okhttp.internal.huc.HttpURLConnectionImpl.getInputStream(HttpURLConnectionImpl.java:243)
	at com.squareup.okhttp.internal.huc.DelegatingHttpsURLConnection.getInputStream(DelegatingHttpsURLConnection.java:210)
	at com.squareup.okhttp.internal.huc.HttpsURLConnectionImpl.getInputStream(HttpsURLConnectionImpl.java:25)
	at org.kohsuke.github.Requester.parse(Requester.java:612)
	at org.kohsuke.github.Requester.parse(Requester.java:594)
	at org.kohsuke.github.Requester._to(Requester.java:272)
Caused: org.kohsuke.github.GHFileNotFoundException: {"message":"Not Found","documentation_url":"https://developer.github.com/v3/repos/statuses/#create-a-status"}
	at org.kohsuke.github.Requester.handleApiError(Requester.java:686)
	at org.kohsuke.github.Requester._to(Requester.java:293)
	at org.kohsuke.github.Requester.to(Requester.java:234)
	at org.kohsuke.github.GHRepository.createCommitStatus(GHRepository.java:1075)
	at org.jenkinsci.plugins.github.status.GitHubCommitStatusSetter.perform(GitHubCommitStatusSetter.java:160)
Caused: org.jenkinsci.plugins.github.common.CombineErrorHandler$ErrorHandlingException
	at org.jenkinsci.plugins.github.common.CombineErrorHandler.handle(CombineErrorHandler.java:74)
	at org.jenkinsci.plugins.github.status.GitHubCommitStatusSetter.perform(GitHubCommitStatusSetter.java:164)
	at com.cloudbees.jenkins.GitHubCommitNotifier.perform(GitHubCommitNotifier.java:151)
	at hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:81)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:744)
	at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:690)
	at hudson.model.Build$BuildExecution.post2(Build.java:186)
	at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:635)
	at hudson.model.Run.execute(Run.java:1749)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Build step 'Set build status on GitHub commit [deprecated]' marked build as failure
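
The secondary failure above is independent of the benchmark: Jenkins tried to publish the build result by POSTing to the GitHub commit-status endpoint and got a 404, which for this endpoint typically means the commit sha is not reachable in the repository or is hidden from the token, not a literally missing route (the response headers earlier show the token did have repo scope). A hedged sketch of the REST call behind GHRepository.createCommitStatus (the plugin uses the kohsuke github-api library, not requests; the token is a placeholder):

    # Hedged sketch of the REST call the traceback shows failing; names
    # and the requests dependency are illustrative.
    import requests

    def set_commit_status(sha, state, context, token):
        url = "https://api.github.com/repos/apache/beam/statuses/" + sha
        resp = requests.post(
            url,
            headers={"Authorization": "token " + token},
            json={"state": state, "context": context})
        resp.raise_for_status()  # the 404 above surfaced here as
                                 # GHFileNotFoundException

    set_commit_status("60f90c8dcb229c35a82c7be15e64a89678bae058",
                      "failure", "beam_PerformanceTests_TextIOIT_HDFS",
                      token="<redacted>")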

Jenkins build is back to normal : beam_PerformanceTests_TextIOIT_HDFS #156

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/156/display/redirect?page=changes>


Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #155

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/155/display/redirect?page=changes>

Changes:

[kedin] [SQL] Change DDL parser to create Beam Schema types instead of Calcite

[kedin] [SQL] Support arrays of primitive types in DDL

[kedin] [SQL] Parse Row fields

[kedin] [SQL] Remove Column. Replace with Schema

[kedin] [SQL] Validate Table schemas in BeamSqlCliTest

[kedin] [SQL] Add support for angular brackets array definition

[kedin] [SQL] Add support for Maps in DDL

[kedin] [SQL] Add support for angular brackets Row syntax

[kedin] [SQL] Fix rebase conflicts

[yifanzou] update machine numbers to 1..16 for inventory tests

[kedin] [SQL] Add QuickCheck and tests for DDL schema verification

[kedin] [SQL] Remove support for postfix array declaration in DDL

[Pablo] StateSampler knows the execution thread it tracks.

[github] Revert "Enable githubCommitNotifier for post commits"

------------------------------------------
[...truncated 433.77 KB...]
    	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
    	at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
    	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1451)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy65.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy66.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1648)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1689)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1624)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
    	at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:138)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:118)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:271)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
    	at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
    	at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeterministicallyConstructTestTextLineFn.processElement(FileBasedIOITHelper.java:77)
    	at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeterministicallyConstructTestTextLineFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
    	at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
    	at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:323)
    	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
    	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:200)
    	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:158)
    	at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:75)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
    	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create file/.temp-beam-2018-05-09_19-14-32-0/f945cfb7-3a92-479a-96b3-a1a3ea7ed7ad. Name node is in safe mode.
    The reported blocks 0 needs additional 31 blocks to reach the threshold 0.9990 of total blocks 31.
    The number of live datanodes 0 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1327)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2447)
    	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2335)
    	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:623)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:397)
    	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at javax.security.auth.Subject.doAs(Subject.java:415)
    	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

    	at org.apache.hadoop.ipc.Client.call(Client.java:1475)
    	at org.apache.hadoop.ipc.Client.call(Client.java:1412)
    	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    	at com.sun.proxy.$Proxy65.create(Unknown Source)
    	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
    	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    	at java.lang.reflect.Method.invoke(Method.java:498)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    	at com.sun.proxy.$Proxy66.create(Unknown Source)
    	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1648)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1689)
    	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1624)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
    	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
    	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:459)
    	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:789)
    	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:778)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:109)
    	at org.apache.beam.sdk.io.hdfs.HadoopFileSystem.create(HadoopFileSystem.java:68)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:249)
    	at org.apache.beam.sdk.io.FileSystems.create(FileSystems.java:236)
    	at org.apache.beam.sdk.io.FileBasedSink$Writer.open(FileBasedSink.java:923)
    	at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesWithSpillingFn.processElement(WriteFiles.java:503)
    Workflow failed. Causes: S02:Generate sequence/Read(BoundedCountingSource)+Produce text lines+Write content to files/WriteFiles/RewindowIntoGlobal/Window.Assign+Write content to files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write content to files/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+Write content to files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write content to files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
      textioit0writethenreadall-05091214-5aau-harness-0szz,
      textioit0writethenreadall-05091214-5aau-harness-0szz,
      textioit0writethenreadall-05091214-5aau-harness-0szz,
      textioit0writethenreadall-05091214-5aau-harness-0szz
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
        at org.apache.beam.sdk.io.text.TextIOIT.writeThenReadAll(TextIOIT.java:114)
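
The SafeModeException earlier in this trace explains itself once the arithmetic is unpacked: the NameNode refuses writes (including Beam's .temp-beam-2018-05-09_19-14-32-0 file) until reported blocks reach the 0.9990 threshold of the 31 total blocks; the minimum datanode count of zero is already met, but with no live datanodes the 31 blocks can never be reported, so safe mode never lifts. Reproducing the numbers from the message (rounding is Hadoop's implementation detail; this only checks the figures quoted in the log):

    # Reproduces the counts quoted in the SafeModeException message.
    import math

    total_blocks = 31
    reported_blocks = 0
    threshold = 0.9990

    needed = int(math.ceil(threshold * total_blocks)) - reported_blocks
    print(needed)  # 31 -- "needs additional 31 blocks to reach the
                   # threshold 0.9990 of total blocks 31"

In other words, the write failure is a symptom: the single HDFS datanode pod evidently never registered with the NameNode, so the cluster stayed in safe mode for the whole run.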

1 test completed, 1 failed
Finished generating test XML results (0.027 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/4a1d4616/beam/sdks/java/io/file-based-io-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.035 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/4a1d4616/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest>
:beam-sdks-java-io-file-based-io-tests:integrationTest (Thread[Task worker for ':' Thread 7,5,main]) completed. Took 11 mins 53.203 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 12m 50s
63 actionable tasks: 26 executed, 14 from cache, 23 up-to-date

Publishing build scan...
https://gradle.com/s/6hedxcxn4lteg


STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
> There were failing tests. See the report at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/4a1d4616/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/4a1d4616/beam/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>
	at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:612)
	at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:484)
	at org.gradle.api.tasks.testing.Test.executeTests(Test.java:583)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:46)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:794)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:761)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-05-09 19:26:22,543 4a1d4616 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 647, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
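
PerfKitBenchmarker surfaces the Gradle failure mechanically here: SubmitJob runs the integration-test command, and a nonzero exit code trips the assert shown in the traceback. A schematic of that mapping (the gradlew command line is an assumption; only the assert is verbatim from gcp_dpb_dataflow.py):

    # Schematic of how a failing Gradle task becomes the AssertionError
    # above. The gradlew invocation is illustrative, not the real SubmitJob.
    import subprocess

    retcode = subprocess.call(
        ["./gradlew", ":beam-sdks-java-io-file-based-io-tests:integrationTest"])
    assert retcode == 0, "Integration Test Failed."

So the AssertionError carries no detail of its own; the actual cause is the HDFS safe-mode failure reported by the test above.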
2018-05-09 19:26:22,545 4a1d4616 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-05-09 19:26:22,545 4a1d4616 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525892572696> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml> --ignore-not-found
2018-05-09 19:26:23,106 4a1d4616 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525892572696> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml> --ignore-not-found
2018-05-09 19:26:23,345 4a1d4616 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 781, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 647, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-05-09 19:26:23,346 4a1d4616 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-09 19:26:23,390 4a1d4616 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-09 19:26:23,391 4a1d4616 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/4a1d4616/pkb.log>
2018-05-09 19:26:23,391 4a1d4616 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/4a1d4616/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #154

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/154/display/redirect>

------------------------------------------
[...truncated 53.16 KB...]
> Task :beam-sdks-java-core:compileJava FROM-CACHE
> Task :beam-sdks-java-core:processResources
> Task :beam-sdks-java-core:classes

> Task :beam-sdks-java-extensions-sql:compileJavacc
Java Compiler Compiler Version 4.0 (Parser Generator)
(type "javacc" with no arguments for help)
Warning: Bad option "-grammar_encoding=UTF-8" will be ignored.
Reading from file <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/317acfdf/beam/sdks/java/extensions/sql/build/generated/fmpp/javacc/Parser.jj> . . .
Note: UNICODE_INPUT option is specified. Please make sure you create the parser/lexer using a Reader with the correct character encoding.
Warning: Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1.  Set option FORCE_LA_CHECK to true to force checking.

> Task :beam-model-pipeline:shadowJar
> Task :beam-model-job-management:extractIncludeProto

> Task :beam-sdks-python:setupVirtualenv
  Using cached https://files.pythonhosted.org/packages/05/f6/0296e29b1bac6f85d2a8556d48adf825307f73109a3c2c17fb734292db0a/grpcio_tools-1.3.5-cp27-cp27mu-manylinux1_x86_64.whl
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (0.6.0)
Requirement not upgraded as not directly required: six in /usr/local/lib/python2.7/dist-packages (from tox==3.0.0) (1.11.0)
Requirement not upgraded as not directly required: virtualenv>=1.11.2 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (15.2.0)
Requirement not upgraded as not directly required: py>=1.4.17 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (1.5.3)
Collecting grpcio>=1.3.5 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl

> Task :beam-model-job-management:generateProto
> Task :beam-model-fn-execution:extractIncludeProto
> Task :beam-model-pipeline:jar
> Task :beam-model-fn-execution:generateProto
> Task :beam-model-pipeline:extractIncludeTestProto
> Task :beam-model-pipeline:extractTestProto
> Task :beam-model-pipeline:generateTestProto NO-SOURCE
> Task :beam-model-pipeline:compileTestJava NO-SOURCE
> Task :beam-model-pipeline:processTestResources NO-SOURCE
> Task :beam-model-pipeline:testClasses UP-TO-DATE
> Task :beam-model-pipeline:packageTests
> Task :beam-model-job-management:compileJava FROM-CACHE
> Task :beam-model-job-management:classes
> Task :beam-model-fn-execution:compileJava FROM-CACHE
> Task :beam-model-fn-execution:classes
> Task :beam-model-pipeline:install

> Task :beam-sdks-java-extensions-sql:compileJavacc
File "TokenMgrError.java" does not exist.  Will create one.
File "ParseException.java" does not exist.  Will create one.
File "Token.java" does not exist.  Will create one.
File "SimpleCharStream.java" does not exist.  Will create one.
Parser generated with 0 errors and 1 warnings.

> Task :beam-sdks-java-extensions-sql:processResources
> Task :beam-sdks-java-extensions-sql:processTestResources NO-SOURCE

> Task :beam-sdks-python:setupVirtualenv
Collecting protobuf>=3.2.0 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/9d/61/54c3a9cfde6ffe0ca6a1786ddb8874263f4ca32e7693ad383bd8cf935015/protobuf-3.5.2.post1-cp27-cp27mu-manylinux1_x86_64.whl
Requirement not upgraded as not directly required: enum34>=1.0.4 in /usr/local/lib/python2.7/dist-packages (from grpcio>=1.3.5->grpcio-tools==1.3.5) (1.1.6)
Collecting futures>=2.2.0 (from grpcio>=1.3.5->grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement not upgraded as not directly required: setuptools in /usr/local/lib/python2.7/dist-packages (from protobuf>=3.2.0->grpcio-tools==1.3.5) (39.0.1)
Could not install packages due to an EnvironmentError: [Errno 13] Permission denied: '/usr/local/lib/python2.7/dist-packages/protobuf-3.5.2.post1-py2.7-nspkg.pth'
Consider using the `--user` option or check the permissions.
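
The EnvironmentError above is pip trying to write into the system
site-packages instead of the task's virtualenv; pip's own suggestion is a
per-user install. A minimal sketch of that fallback in Python, assuming the
package pins from this log (the helper name is hypothetical, and the more
robust CI fix is to make sure pip really runs inside the virtualenv):

    import subprocess
    import sys

    def pip_install_user(packages):
        # Install into the user site (~/.local) so no write to
        # /usr/local/lib/python2.7/dist-packages is ever attempted.
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "--user"]
            + list(packages))

    if __name__ == "__main__":
        # Version pins taken from the log above.
        pip_install_user(["grpcio-tools==1.3.5", "tox==3.0.0"])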


> Task :beam-sdks-python:setupVirtualenv FAILED
> Task :beam-model-job-management:shadowJar
> Task :beam-model-fn-execution:shadowJar
> Task :beam-sdks-java-core:shadowJar

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58s
132 actionable tasks: 125 executed, 5 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/55m5g74ds5dzi


STDERR: 
FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/317acfdf/beam/sdks/python/build.gradle>' line: 36

* What went wrong:
Execution failed for task ':beam-sdks-python:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-python:setupVirtualenv'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.process.internal.ExecException: Process 'command 'sh'' finished with non-zero exit value 1
	at org.gradle.process.internal.DefaultExecHandle$ExecResultImpl.assertNormalExitValue(DefaultExecHandle.java:389)
	at org.gradle.process.internal.DefaultExecAction.execute(DefaultExecAction.java:36)
	at org.gradle.api.internal.file.DefaultFileOperations.exec(DefaultFileOperations.java:192)
	at org.gradle.api.internal.project.DefaultProject.exec(DefaultProject.java:1087)
	at org.gradle.groovy.scripts.DefaultScript.exec(DefaultScript.java:253)
	at org.gradle.internal.metaobject.BeanDynamicObject$MetaClassAdapter.invokeMethod(BeanDynamicObject.java:479)
	at org.gradle.internal.metaobject.BeanDynamicObject.tryInvokeMethod(BeanDynamicObject.java:191)
	at org.gradle.groovy.scripts.BasicScript$ScriptDynamicObject.tryInvokeMethod(BasicScript.java:130)
	at org.gradle.internal.metaobject.ConfigureDelegate.invokeMethod(ConfigureDelegate.java:78)
	at build_cdagcaic4bij6iogsu372syur$_run_closure2$_closure16.doCall(<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/317acfdf/beam/sdks/python/build.gradle>:36)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:732)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:705)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-05-09 12:01:53,302 317acfdf MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525860061094> create -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-05-09 12:01:53,801 317acfdf MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525860061094> create -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
2018-05-09 12:01:54,159 317acfdf MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2018-05-09 12:01:54,169 317acfdf MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525860061094> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
[... the same kubectl query repeated at roughly 10-second intervals from 12:02:04 through 12:04:47 without the service reporting an ingress IP ...]
2018-05-09 12:04:58,041 317acfdf MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525860061094> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
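
The repeated commands above are PerfKit polling the hadoop-external service
until Kubernetes assigns a LoadBalancer ingress IP; in this run every
attempt came back empty. A minimal sketch of such a polling loop in Python,
assuming kubectl is on PATH (the function name, retry count, and delay are
illustrative, not PerfKit's actual code):

    import subprocess
    import time

    def retrieve_load_balancer_ip(service, kubeconfig, retries=18, delay=10):
        # Poll kubectl until the service reports a LoadBalancer ingress IP.
        cmd = ["kubectl", "--kubeconfig=" + kubeconfig, "get", "svc", service,
               "-ojsonpath={.status.loadBalancer.ingress[0].ip}"]
        for _ in range(retries):
            ip = subprocess.check_output(cmd).decode().strip()
            if ip:
                return ip
            time.sleep(delay)
        raise RuntimeError("Could not retrieve LoadBalancer IP address")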
2018-05-09 12:04:58,233 317acfdf MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 647, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
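
The TypeError is a bug in the error path itself: RetrieveLoadBalancerIp
raises a plain string, which Python rejects (string exceptions were
removed), so the intended message never reaches the log. A minimal sketch
of the corrected raise (the exception class name is illustrative):

    class LoadBalancerIpError(Exception):
        """Any BaseException subclass works here; a str does not."""

    def retrieve_or_fail(ip):
        if not ip:
            # The original `raise "Could not retrieve ..."` is what
            # triggers the TypeError above; raise an instance instead.
            raise LoadBalancerIpError(
                "Could not retrieve LoadBalancer IP address")
        return ip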
2018-05-09 12:04:58,235 317acfdf MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-05-09 12:04:58,236 317acfdf MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525860061094> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml> --ignore-not-found
2018-05-09 12:04:59,276 317acfdf MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1525860061094> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml> --ignore-not-found
2018-05-09 12:04:59,530 317acfdf MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 781, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 647, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
2018-05-09 12:04:59,531 317acfdf MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-09 12:04:59,574 317acfdf MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-09 12:04:59,575 317acfdf MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/317acfdf/pkb.log>
2018-05-09 12:04:59,576 317acfdf MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/317acfdf/completion_statuses.json>
Build step 'Execute shell' marked build as failure
[Set GitHub commit status (universal)] ERROR on repos [GHRepository@175b92de[description=Apache Beam,homepage=,name=beam,fork=false,size=58754,language=Java,url=https://api.github.com/repos/apache/beam,id=50904245 [...HTTP response header fields truncated...]]] (sha:60f90c8) with context:beam_PerformanceTests_TextIOIT_HDFS
Setting commit status on GitHub for https://github.com/apache/beam/commit/60f90c8dcb229c35a82c7be15e64a89678bae058
ERROR: Build step failed with exception
java.io.FileNotFoundException: https://api.github.com/repos/apache/beam/statuses/60f90c8dcb229c35a82c7be15e64a89678bae058
	at com.squareup.okhttp.internal.huc.HttpURLConnectionImpl.getInputStream(HttpURLConnectionImpl.java:243)
	at com.squareup.okhttp.internal.huc.DelegatingHttpsURLConnection.getInputStream(DelegatingHttpsURLConnection.java:210)
	at com.squareup.okhttp.internal.huc.HttpsURLConnectionImpl.getInputStream(HttpsURLConnectionImpl.java:25)
	at org.kohsuke.github.Requester.parse(Requester.java:612)
	at org.kohsuke.github.Requester.parse(Requester.java:594)
	at org.kohsuke.github.Requester._to(Requester.java:272)
Caused: org.kohsuke.github.GHFileNotFoundException: {"message":"Not Found","documentation_url":"https://developer.github.com/v3/repos/statuses/#create-a-status"}
	at org.kohsuke.github.Requester.handleApiError(Requester.java:686)
	at org.kohsuke.github.Requester._to(Requester.java:293)
	at org.kohsuke.github.Requester.to(Requester.java:234)
	at org.kohsuke.github.GHRepository.createCommitStatus(GHRepository.java:1075)
	at org.jenkinsci.plugins.github.status.GitHubCommitStatusSetter.perform(GitHubCommitStatusSetter.java:160)
Caused: org.jenkinsci.plugins.github.common.CombineErrorHandler$ErrorHandlingException
	at org.jenkinsci.plugins.github.common.CombineErrorHandler.handle(CombineErrorHandler.java:74)
	at org.jenkinsci.plugins.github.status.GitHubCommitStatusSetter.perform(GitHubCommitStatusSetter.java:164)
	at com.cloudbees.jenkins.GitHubCommitNotifier.perform(GitHubCommitNotifier.java:151)
	at hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:81)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:744)
	at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:690)
	at hudson.model.Build$BuildExecution.post2(Build.java:186)
	at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:635)
	at hudson.model.Run.execute(Run.java:1749)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Build step 'Set build status on GitHub commit [deprecated]' marked build as failure
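
The trailing GitHub failure is independent of the benchmark: the
commit-status POST returned 404, which the GitHub API sends both for an
unknown commit and for a token that lacks permission to set statuses. A
hedged sketch of the equivalent request in Python (using the requests
library and a token environment variable are assumptions about setup, not
what the Jenkins plugin does internally):

    import os
    import requests

    def set_commit_status(owner, repo, sha, state, context, token):
        # POST /repos/{owner}/{repo}/statuses/{sha}; GitHub answers 404
        # when the commit is unknown or the token cannot write statuses.
        resp = requests.post(
            "https://api.github.com/repos/%s/%s/statuses/%s"
            % (owner, repo, sha),
            json={"state": state, "context": context},
            headers={"Authorization": "token " + token})
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        set_commit_status("apache", "beam",
                          "60f90c8dcb229c35a82c7be15e64a89678bae058",
                          "failure", "beam_PerformanceTests_TextIOIT_HDFS",
                          os.environ["GITHUB_TOKEN"])  # hypothetical env var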