Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/05/02 12:04:39 UTC

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #119

See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/119/display/redirect>

------------------------------------------
[...truncated 52.05 KB...]
> Task :beam-runners-apex:processTestResources
> Task :beam-sdks-java-build-tools:compileTestJava FROM-CACHE
> Task :beam-sdks-java-build-tools:processTestResources NO-SOURCE
> Task :beam-sdks-java-build-tools:testClasses UP-TO-DATE
> Task :beam-sdks-java-build-tools:packageTests
> Task :beam-sdks-java-extensions-sql:generateFmppSources
> Task :beam-model-pipeline:compileJava FROM-CACHE
> Task :beam-model-pipeline:processResources
> Task :beam-model-pipeline:classes
> Task :beam-sdks-java-build-tools:install

> Task :beam-sdks-python:setupVirtualenv
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/a43f03f5/beam/sdks/python/build/gradleenv/bin/python2>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/a43f03f5/beam/sdks/python/build/gradleenv/bin/python>
Installing setuptools, pkg_resources, pip, wheel...
> Task :beam-sdks-java-core:compileJava FROM-CACHE
> Task :beam-sdks-java-core:processResources
> Task :beam-sdks-java-core:classes
> Task :beam-model-pipeline:shadowJar

> Task :beam-sdks-java-extensions-sql:compileJavacc
Java Compiler Compiler Version 6.1_2 (Parser Generator)
(type "javacc" with no arguments for help)
Reading from file <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/a43f03f5/beam/sdks/java/extensions/sql/build/generated/fmpp/javacc/Parser.jj> . . .
Note: UNICODE_INPUT option is specified. Please make sure you create the parser/lexer using a Reader with the correct character encoding.
Warning: Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1.  Set option FORCE_LA_CHECK to true to force checking.
File "TokenMgrError.java" does not exist.  Will create one.
File "ParseException.java" does not exist.  Will create one.
File "Token.java" does not exist.  Will create one.
File "SimpleCharStream.java" does not exist.  Will create one.
Parser generated with 0 errors and 1 warnings.

> Task :beam-sdks-java-extensions-sql:processResources
> Task :beam-sdks-java-extensions-sql:processTestResources NO-SOURCE
> Task :beam-model-fn-execution:extractIncludeProto
> Task :beam-model-job-management:extractIncludeProto
> Task :beam-model-pipeline:jar
> Task :beam-model-job-management:generateProto
> Task :beam-model-pipeline:extractIncludeTestProto
> Task :beam-model-pipeline:extractTestProto
> Task :beam-model-pipeline:generateTestProto NO-SOURCE
> Task :beam-model-pipeline:compileTestJava NO-SOURCE
> Task :beam-model-pipeline:processTestResources NO-SOURCE
> Task :beam-model-pipeline:testClasses UP-TO-DATE
> Task :beam-model-pipeline:packageTests
> Task :beam-model-fn-execution:generateProto
> Task :beam-model-pipeline:install
> Task :beam-model-job-management:compileJava FROM-CACHE
> Task :beam-model-job-management:classes
> Task :beam-model-fn-execution:compileJava FROM-CACHE
> Task :beam-model-fn-execution:classes

> Task :beam-sdks-python:setupVirtualenv
done.
Running virtualenv with interpreter /usr/bin/python2
Collecting tox==3.0.0
  Using cached https://files.pythonhosted.org/packages/e6/41/4dcfd713282bf3213b0384320fa8841e4db032ddcb80bc08a540159d42a8/tox-3.0.0-py2.py3-none-any.whl
Collecting grpcio-tools==1.3.5
  Using cached https://files.pythonhosted.org/packages/05/f6/0296e29b1bac6f85d2a8556d48adf825307f73109a3c2c17fb734292db0a/grpcio_tools-1.3.5-cp27-cp27mu-manylinux1_x86_64.whl
Collecting pluggy<1.0,>=0.3.0 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/82/05/43e3947125a2137cba4746135c75934ceed1863f27e050fc560052104a71/pluggy-0.6.0-py2-none-any.whl
Requirement not upgraded as not directly required: six in /usr/local/lib/python2.7/dist-packages (from tox==3.0.0) (1.11.0)
Requirement not upgraded as not directly required: virtualenv>=1.11.2 in /usr/lib/python2.7/dist-packages (from tox==3.0.0) (15.0.1)
Collecting py>=1.4.17 (from tox==3.0.0)
  Using cached https://files.pythonhosted.org/packages/67/a5/f77982214dd4c8fd104b066f249adea2c49e25e8703d284382eb5e9ab35a/py-1.5.3-py2.py3-none-any.whl
Collecting grpcio>=1.3.5 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting protobuf>=3.2.0 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/9d/61/54c3a9cfde6ffe0ca6a1786ddb8874263f4ca32e7693ad383bd8cf935015/protobuf-3.5.2.post1-cp27-cp27mu-manylinux1_x86_64.whl
Requirement not upgraded as not directly required: enum34>=1.0.4 in /usr/local/lib/python2.7/dist-packages (from grpcio>=1.3.5->grpcio-tools==1.3.5) (1.1.6)
Collecting futures>=2.2.0 (from grpcio>=1.3.5->grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement not upgraded as not directly required: setuptools in /usr/local/lib/python2.7/dist-packages (from protobuf>=3.2.0->grpcio-tools==1.3.5) (39.0.1)
Installing collected packages: pluggy, py, tox, protobuf, futures, grpcio, grpcio-tools
Could not install packages due to an EnvironmentError: [Errno 13] Permission denied: '/usr/local/lib/python2.7/dist-packages/pluggy-0.6.0.dist-info'
Consider using the `--user` option or check the permissions.
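
From the output above, the pip run that collects tox==3.0.0 and grpcio-tools==1.3.5 tried to write into the system site-packages (/usr/local/lib/python2.7/dist-packages) rather than the freshly created gradleenv virtualenv, and the Jenkins user has no write permission there; pip's own hint is to use --user or fix permissions. A minimal illustrative sketch, not Beam's actual build.gradle logic, of installing the same pins with an explicit interpreter so they land in the virtualenv created earlier in this task:

    import subprocess

    def pip_install(python, packages):
        """Install packages with the pip that belongs to the given interpreter."""
        subprocess.check_call([python, "-m", "pip", "install"] + list(packages))

    # Targeting the virtualenv's interpreter (path taken from the log above) makes
    # the install go into build/gradleenv instead of the root-owned system
    # site-packages; `pip install --user` is the alternative for non-virtualenv runs.
    pip_install("build/gradleenv/bin/python", ["tox==3.0.0", "grpcio-tools==1.3.5"])
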


> Task :beam-model-job-management:shadowJar
> Task :beam-model-job-management:jar
> Task :beam-model-fn-execution:shadowJar
> Task :beam-model-job-management:extractIncludeTestProto
> Task :beam-model-job-management:generateTestProto NO-SOURCE
> Task :beam-model-job-management:compileTestJava NO-SOURCE
> Task :beam-model-job-management:testClasses UP-TO-DATE
> Task :beam-model-job-management:packageTests
> Task :beam-model-job-management:install
> Task :beam-model-fn-execution:jar
> Task :beam-model-fn-execution:extractIncludeTestProto
> Task :beam-model-fn-execution:generateTestProto NO-SOURCE
> Task :beam-model-fn-execution:compileTestJava NO-SOURCE
> Task :beam-model-fn-execution:testClasses UP-TO-DATE
> Task :beam-model-fn-execution:packageTests
> Task :beam-model-fn-execution:install
> Task :beam-sdks-python:setupVirtualenv FAILED
> Task :beam-model-fn-execution:shadowTestJar
> Task :beam-sdks-java-core:shadowJar

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 35s
141 actionable tasks: 134 executed, 5 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/5xjf4xdlit3q4


STDERR: 
FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/a43f03f5/beam/sdks/python/build.gradle'> line: 36

* What went wrong:
Execution failed for task ':beam-sdks-python:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-python:setupVirtualenv'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.process.internal.ExecException: Process 'command 'sh'' finished with non-zero exit value 1
	at org.gradle.process.internal.DefaultExecHandle$ExecResultImpl.assertNormalExitValue(DefaultExecHandle.java:389)
	at org.gradle.process.internal.DefaultExecAction.execute(DefaultExecAction.java:36)
	at org.gradle.api.internal.file.DefaultFileOperations.exec(DefaultFileOperations.java:192)
	at org.gradle.api.internal.project.DefaultProject.exec(DefaultProject.java:1087)
	at org.gradle.groovy.scripts.DefaultScript.exec(DefaultScript.java:253)
	at org.gradle.internal.metaobject.BeanDynamicObject$MetaClassAdapter.invokeMethod(BeanDynamicObject.java:479)
	at org.gradle.internal.metaobject.BeanDynamicObject.tryInvokeMethod(BeanDynamicObject.java:191)
	at org.gradle.groovy.scripts.BasicScript$ScriptDynamicObject.tryInvokeMethod(BasicScript.java:130)
	at org.gradle.internal.metaobject.ConfigureDelegate.invokeMethod(ConfigureDelegate.java:78)
	at build_39d8pnqz2kboavy5tw1h1qzta$_run_closure2$_closure16.doCall(<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/a43f03f5/beam/sdks/python/build.gradle>:36)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:732)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:705)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-05-02 12:01:34,491 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> create -f <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-05-02 12:01:34,959 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> create -f <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
2018-05-02 12:01:35,146 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2018-05-02 12:01:35,151 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:01:45,304 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:01:55,445 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:02:05,577 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:02:15,710 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:02:25,845 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:02:36,012 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:02:46,145 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:02:56,279 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:03:06,416 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:03:16,553 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:03:26,684 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:03:36,825 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:03:46,960 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:03:57,096 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:04:07,231 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:04:17,369 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:04:27,539 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 12:04:37,711 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
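
The repeated kubectl calls above are PerfKitBenchmarker polling the hadoop-external Service for a LoadBalancer ingress IP roughly every 10 seconds before giving up. A rough sketch of such a retry loop, where the helper name, attempt count, and return convention are assumptions rather than PerfKitBenchmarker's API:

    import subprocess
    import time

    def wait_for_load_balancer_ip(kubeconfig, service="hadoop-external",
                                  attempts=18, interval=10):
        """Poll the Service until its LoadBalancer reports an ingress IP."""
        cmd = ["kubectl", "--kubeconfig=" + kubeconfig, "get", "svc", service,
               "-ojsonpath={.status.loadBalancer.ingress[0].ip}"]
        for _ in range(attempts):
            ip = subprocess.check_output(cmd).decode().strip()
            if ip:
                return ip          # Service is reachable from outside the cluster
            time.sleep(interval)   # matches the ~10 s spacing of the log lines
        return None                # caller decides how to report the failure
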
2018-05-02 12:04:37,841 a43f03f5 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
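
The TypeError above is a secondary failure: line 153 of beam_pipeline_options.py raises a plain string, which Python 2.6+ (and Python 3) rejects because only BaseException subclasses can be raised, so the intended "Could not retrieve LoadBalancer IP address" message is replaced by the TypeError. A minimal sketch of the conventional fix, with an exception class name that is illustrative rather than PerfKitBenchmarker's:

    class LoadBalancerIpError(Exception):
        """Raised when the hadoop-external Service never reports an ingress IP."""

    def retrieve_load_balancer_ip(ip):
        if not ip:
            # `raise "some message"` triggers the TypeError seen in the log;
            # raising an Exception subclass preserves the intended message.
            raise LoadBalancerIpError("Could not retrieve LoadBalancer IP address")
        return ip
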
2018-05-02 12:04:37,842 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-05-02 12:04:37,842 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> delete -f <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml> --ignore-not-found
2018-05-02 12:04:38,290 a43f03f5 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525255264176> delete -f <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml> --ignore-not-found
2018-05-02 12:04:38,478 a43f03f5 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 780, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
2018-05-02 12:04:38,478 a43f03f5 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-02 12:04:38,524 a43f03f5 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-02 12:04:38,525 a43f03f5 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/a43f03f5/pkb.log>
2018-05-02 12:04:38,525 a43f03f5 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/a43f03f5/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Jenkins build is back to normal : beam_PerformanceTests_XmlIOIT_HDFS #121

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/121/display/redirect?page=changes>


Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #120

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/120/display/redirect?page=changes>

Changes:

[github] Count compressed records with a long to avoid overflow

[alan] [BEAM-4218] Fix failing javadoc build

[github] Fix DynamicDestinations documentation

------------------------------------------
[...truncated 91.68 KB...]
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/reflect/DoFnInvoker.java>:86: warning: [TypeParameterUnusedInFormals] Declaring a type parameter that is only used in the return type is a misuse of generics: operations on the type parameter are unchecked, it hides unsafe casts at invocations of the method, and it interacts badly with method overload resolution.
  <RestrictionT, TrackerT extends RestrictionTracker<RestrictionT, ?>> TrackerT invokeNewTracker(
                                                                                ^
    (see http://errorprone.info/bugpattern/TypeParameterUnusedInFormals)
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/reflect/DoFnInvoker.java>:285: warning: [MissingOverride] restrictionTracker implements method in ArgumentProvider; expected @Override
    public RestrictionTracker<?, ?> restrictionTracker() {
                                    ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public RestrictionTracker<?, ?> restrictionTracker() {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:428: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<String> expand(PCollection<String> in) {
                               ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<String> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:470: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<String> expand(PCollection<String> in) {
                               ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<String> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:512: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<List<String>> expand(PCollection<String> in) {
                                     ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<List<String>> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:564: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<KV<String, String>> expand(PCollection<String> in) {
                                           ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<KV<String, String>> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:610: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<KV<String, String>> expand(PCollection<String> in) {
                                           ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<KV<String, String>> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:652: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<String> expand(PCollection<String> in) {
                               ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<String> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:694: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<String> expand(PCollection<String> in) {
                               ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<String> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:735: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<List<String>> expand(PCollection<String> in) {
                                     ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<List<String>> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:788: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<KV<String, String>> expand(PCollection<String> in) {
                                           ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<KV<String, String>> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:835: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<KV<String, String>> expand(PCollection<String> in) {
                                           ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<KV<String, String>> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:877: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<String> expand(PCollection<String> in) {
                               ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<String> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:916: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<String> expand(PCollection<String> in) {
                               ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<String> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/transforms/Regex.java>:957: warning: [MissingOverride] expand implements method in PTransform; expected @Override
    public PCollection<String> expand(PCollection<String> in) {
                               ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public PCollection<String> expand(PCollection<String> in) {'?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/util/RowJsonDeserializer.java>:76: warning: [MutableConstantField] Constant field declarations should use the immutable type (such as ImmutableList) instead of the general collection interface type (such as List)
  private static final Map<TypeName, ValueExtractor<?>> JSON_VALUE_GETTERS =
                          ^
    (see http://errorprone.info/bugpattern/MutableConstantField)
  Did you mean 'private static final ImmutableMap<TypeName, ValueExtractor<?>> JSON_VALUE_GETTERS ='?
<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/java/core/src/main/java/org/apache/beam/sdk/util/WeightedValue.java>:39: warning: [MissingOverride] getWeight implements method in Weighted; expected @Override
  public long getWeight() {
              ^
    (see http://errorprone.info/bugpattern/MissingOverride)
  Did you mean '@Override public long getWeight() {'?
/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy/1.7.10/byte-buddy-1.7.10.jar(/net/bytebuddy/description/type/TypeDescription$AbstractBase.class): warning: Cannot find annotation method 'value()' in type 'SuppressFBWarnings'
/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy/1.7.10/byte-buddy-1.7.10.jar(/net/bytebuddy/description/type/TypeDescription$AbstractBase.class): warning: Cannot find annotation method 'justification()' in type 'SuppressFBWarnings'
/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy/1.7.10/byte-buddy-1.7.10.jar(/net/bytebuddy/dynamic/scaffold/MethodGraph$Compiler.class): warning: Cannot find annotation method 'value()' in type 'SuppressFBWarnings'
/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy/1.7.10/byte-buddy-1.7.10.jar(/net/bytebuddy/dynamic/scaffold/MethodGraph$Compiler.class): warning: Cannot find annotation method 'justification()' in type 'SuppressFBWarnings'
/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy/1.7.10/byte-buddy-1.7.10.jar(/net/bytebuddy/implementation/auxiliary/AuxiliaryType.class): warning: Cannot find annotation method 'value()' in type 'SuppressFBWarnings'
/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy/1.7.10/byte-buddy-1.7.10.jar(/net/bytebuddy/implementation/auxiliary/AuxiliaryType.class): warning: Cannot find annotation method 'justification()' in type 'SuppressFBWarnings'
/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy/1.7.10/byte-buddy-1.7.10.jar(/net/bytebuddy/ClassFileVersion.class): warning: Cannot find annotation method 'value()' in type 'SuppressFBWarnings'
/home/jenkins/.m2/repository/net/bytebuddy/byte-buddy/1.7.10/byte-buddy-1.7.10.jar(/net/bytebuddy/ClassFileVersion.class): warning: Cannot find annotation method 'justification()' in type 'SuppressFBWarnings'
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
100 warnings

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49s
136 actionable tasks: 130 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/wameg6nqya2gw


STDERR: 
FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/python/build.gradle'> line: 36

* What went wrong:
Execution failed for task ':beam-sdks-python:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-python:setupVirtualenv'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.process.internal.ExecException: Process 'command 'sh'' finished with non-zero exit value 1
	at org.gradle.process.internal.DefaultExecHandle$ExecResultImpl.assertNormalExitValue(DefaultExecHandle.java:389)
	at org.gradle.process.internal.DefaultExecAction.execute(DefaultExecAction.java:36)
	at org.gradle.api.internal.file.DefaultFileOperations.exec(DefaultFileOperations.java:192)
	at org.gradle.api.internal.project.DefaultProject.exec(DefaultProject.java:1087)
	at org.gradle.groovy.scripts.DefaultScript.exec(DefaultScript.java:253)
	at org.gradle.internal.metaobject.BeanDynamicObject$MetaClassAdapter.invokeMethod(BeanDynamicObject.java:479)
	at org.gradle.internal.metaobject.BeanDynamicObject.tryInvokeMethod(BeanDynamicObject.java:191)
	at org.gradle.groovy.scripts.BasicScript$ScriptDynamicObject.tryInvokeMethod(BasicScript.java:130)
	at org.gradle.internal.metaobject.ConfigureDelegate.invokeMethod(ConfigureDelegate.java:78)
	at build_btkqyyw82lcte7g45lyb6kkb4$_run_closure2$_closure16.doCall(<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/beam/sdks/python/build.gradle>:36)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:732)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:705)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-05-02 18:02:06,494 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> create -f <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-05-02 18:02:06,803 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> create -f <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
2018-05-02 18:02:07,021 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2018-05-02 18:02:07,028 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:02:17,195 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:02:27,347 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:02:37,493 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:02:47,686 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:02:57,868 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:03:08,015 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:03:18,154 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:03:28,294 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:03:38,441 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:03:48,599 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:03:58,731 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:04:08,861 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:04:19,017 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:04:29,157 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:04:39,296 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:04:49,435 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:04:59,602 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:05:09,757 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-02 18:05:09,875 6b20bd96 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
2018-05-02 18:05:09,876 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-05-02 18:05:09,876 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> delete -f <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml> --ignore-not-found
2018-05-02 18:05:10,261 6b20bd96 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/config-filebasedioithdfs-1525278645267> delete -f <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml> --ignore-not-found
2018-05-02 18:05:10,452 6b20bd96 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 780, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 646, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
2018-05-02 18:05:10,453 6b20bd96 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-02 18:05:10,465 6b20bd96 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-02 18:05:10,467 6b20bd96 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/pkb.log>
2018-05-02 18:05:10,467 6b20bd96 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_XmlIOIT_HDFS/ws/runs/6b20bd96/completion_statuses.json>
Build step 'Execute shell' marked build as failure