Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/05/13 18:04:54 UTC

Build failed in Jenkins: beam_PerformanceTests_TextIOIT_HDFS #171

See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/171/display/redirect>

------------------------------------------
[...truncated 51.00 KB...]
> Task :beam-sdks-java-build-tools:jar
> Task :beam-sdks-java-maven-archetypes-starter:processTestResources
> Task :beam-model-pipeline:extractIncludeProto
> Task :beam-model-pipeline:extractProto
> Task :beam-sdks-java-io-elasticsearch-tests-common:shadowJar
> Task :beam-sdks-java-io-elasticsearch-tests-common:processTestResources NO-SOURCE
> Task :beam-sdks-java-io-elasticsearch-tests-common:jar
> Task :beam-sdks-java-maven-archetypes-examples:generateSources
> Task :beam-sdks-java-extensions-sql:copyFmppTemplatesFromCalciteCore
> Task :beam-sdks-java-extensions-sql:copyFmppTemplatesFromSrc
> Task :beam-sdks-java-maven-archetypes-examples:processResources
> Task :beam-sdks-java-maven-archetypes-examples:processTestResources
> Task :beam-model-pipeline:generateProto

> Task :beam-runners-apex:buildDependencyTree
See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/ed7c8f56/beam/runners/apex/build/classes/java/main/org/apache/beam/runners/apex/dependency-tree>

> Task :beam-runners-apex:processResources NO-SOURCE
> Task :beam-runners-apex:processTestResources
> Task :beam-sdks-java-build-tools:compileTestJava FROM-CACHE
> Task :beam-sdks-java-build-tools:processTestResources NO-SOURCE
> Task :beam-sdks-java-build-tools:testClasses UP-TO-DATE
> Task :beam-sdks-java-build-tools:packageTests
> Task :beam-model-pipeline:compileJava FROM-CACHE
> Task :beam-model-pipeline:processResources
> Task :beam-model-pipeline:classes

> Task :beam-sdks-python:setupVirtualenv
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/ed7c8f56/beam/sdks/python/build/gradleenv/bin/python2>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/ed7c8f56/beam/sdks/python/build/gradleenv/bin/python>
Installing setuptools, pkg_resources, pip, wheel...
> Task :beam-sdks-java-extensions-sql:generateFmppSources
> Task :beam-sdks-java-build-tools:install

> Task :beam-sdks-python:setupVirtualenv
done.
Running virtualenv with interpreter /usr/bin/python2
Requirement already up-to-date: tox==3.0.0 in /home/jenkins/.local/lib/python2.7/site-packages (3.0.0)
Collecting grpcio-tools==1.3.5

> Task :beam-sdks-java-core:compileJava FROM-CACHE
> Task :beam-sdks-java-core:processResources
> Task :beam-sdks-java-core:classes

> Task :beam-sdks-java-extensions-sql:compileJavacc
Note: UNICODE_INPUT option is specified. Please make sure you create the parser/lexer using a Reader with the correct character encoding.
Warning: Lookahead adequacy checking not being performed since option LOOKAHEAD is more than 1.  Set option FORCE_LA_CHECK to true to force checking.
File "TokenMgrError.java" does not exist.  Will create one.
File "ParseException.java" does not exist.  Will create one.
File "Token.java" does not exist.  Will create one.
File "SimpleCharStream.java" does not exist.  Will create one.
Parser generated with 0 errors and 1 warnings.

> Task :beam-model-pipeline:shadowJar
> Task :beam-model-pipeline:jar
> Task :beam-model-job-management:extractIncludeProto
> Task :beam-model-fn-execution:extractIncludeProto
> Task :beam-model-pipeline:extractIncludeTestProto
> Task :beam-model-pipeline:extractTestProto
> Task :beam-model-pipeline:generateTestProto NO-SOURCE
> Task :beam-model-pipeline:compileTestJava NO-SOURCE
> Task :beam-model-pipeline:processTestResources NO-SOURCE
> Task :beam-model-pipeline:testClasses UP-TO-DATE
> Task :beam-model-pipeline:packageTests
> Task :beam-model-job-management:generateProto
> Task :beam-model-fn-execution:generateProto
> Task :beam-model-pipeline:install
> Task :beam-sdks-java-extensions-sql:processResources
> Task :beam-sdks-java-extensions-sql:processTestResources NO-SOURCE
> Task :beam-model-job-management:compileJava FROM-CACHE
> Task :beam-model-job-management:classes
> Task :beam-model-fn-execution:compileJava FROM-CACHE
> Task :beam-model-fn-execution:classes

> Task :beam-sdks-python:setupVirtualenv
  Using cached https://files.pythonhosted.org/packages/05/f6/0296e29b1bac6f85d2a8556d48adf825307f73109a3c2c17fb734292db0a/grpcio_tools-1.3.5-cp27-cp27mu-manylinux1_x86_64.whl
Requirement not upgraded as not directly required: pluggy<1.0,>=0.3.0 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (0.6.0)
Requirement not upgraded as not directly required: six in /usr/local/lib/python2.7/dist-packages (from tox==3.0.0) (1.11.0)
Requirement not upgraded as not directly required: virtualenv>=1.11.2 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (15.2.0)
Requirement not upgraded as not directly required: py>=1.4.17 in /home/jenkins/.local/lib/python2.7/site-packages (from tox==3.0.0) (1.5.3)
Collecting grpcio>=1.3.5 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/0d/54/b647a6323be6526be27b2c90bb042769f1a7a6e59bd1a5f2eeb795bfece4/grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting protobuf>=3.2.0 (from grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/9d/61/54c3a9cfde6ffe0ca6a1786ddb8874263f4ca32e7693ad383bd8cf935015/protobuf-3.5.2.post1-cp27-cp27mu-manylinux1_x86_64.whl
Requirement not upgraded as not directly required: enum34>=1.0.4 in /usr/local/lib/python2.7/dist-packages (from grpcio>=1.3.5->grpcio-tools==1.3.5) (1.1.6)
Collecting futures>=2.2.0 (from grpcio>=1.3.5->grpcio-tools==1.3.5)
  Using cached https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Requirement not upgraded as not directly required: setuptools in /usr/local/lib/python2.7/dist-packages (from protobuf>=3.2.0->grpcio-tools==1.3.5) (39.0.1)
Installing collected packages: protobuf, futures, grpcio, grpcio-tools
Could not install packages due to an EnvironmentError: [Errno 13] Permission denied: '/usr/local/lib/python2.7/dist-packages/protobuf-3.5.2.post1-py2.7-nspkg.pth'
Consider using the `--user` option or check the permissions.
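
The permission error above indicates that pip attempted to write into the system site-packages (/usr/local/lib/python2.7/dist-packages) rather than the gradleenv virtualenv created earlier in the log. A minimal Python sketch of the usual workaround, assuming the virtualenv layout shown above (the pip path and the subprocess call are illustrative, not the Beam build's actual task code):

    # Install through the virtualenv's own pip so packages land inside
    # build/gradleenv rather than the system dist-packages (avoids the
    # "[Errno 13] Permission denied" seen above).
    import subprocess

    VENV_PIP = "build/gradleenv/bin/pip"  # hypothetical path, derived from the virtualenv created above

    subprocess.check_call([VENV_PIP, "install", "grpcio-tools==1.3.5"])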


> Task :beam-sdks-python:setupVirtualenv FAILED
> Task :beam-model-job-management:shadowJar
> Task :beam-model-fn-execution:shadowJar
> Task :beam-sdks-java-core:shadowJar

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 51s
132 actionable tasks: 125 executed, 5 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/qtepjjhhe4zao


STDERR: 
FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/ed7c8f56/beam/sdks/python/build.gradle'> line: 36

* What went wrong:
Execution failed for task ':beam-sdks-python:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-python:setupVirtualenv'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
	at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
	at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
	at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
	at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
	at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
	at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.process.internal.ExecException: Process 'command 'sh'' finished with non-zero exit value 1
	at org.gradle.process.internal.DefaultExecHandle$ExecResultImpl.assertNormalExitValue(DefaultExecHandle.java:389)
	at org.gradle.process.internal.DefaultExecAction.execute(DefaultExecAction.java:36)
	at org.gradle.api.internal.file.DefaultFileOperations.exec(DefaultFileOperations.java:192)
	at org.gradle.api.internal.project.DefaultProject.exec(DefaultProject.java:1087)
	at org.gradle.groovy.scripts.DefaultScript.exec(DefaultScript.java:253)
	at org.gradle.internal.metaobject.BeanDynamicObject$MetaClassAdapter.invokeMethod(BeanDynamicObject.java:479)
	at org.gradle.internal.metaobject.BeanDynamicObject.tryInvokeMethod(BeanDynamicObject.java:191)
	at org.gradle.groovy.scripts.BasicScript$ScriptDynamicObject.tryInvokeMethod(BasicScript.java:130)
	at org.gradle.internal.metaobject.ConfigureDelegate.invokeMethod(ConfigureDelegate.java:78)
	at build_en7xjm8u9daypwt989omrd03d$_run_closure2$_closure22.doCall(<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/ed7c8f56/beam/sdks/python/build.gradle>:36)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:732)
	at org.gradle.api.internal.AbstractTask$ClosureTaskAction.execute(AbstractTask.java:705)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
	... 31 more


* Get more help at https://help.gradle.org

2018-05-13 18:01:49,872 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> create -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml>
2018-05-13 18:01:50,255 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> create -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml>
2018-05-13 18:01:50,534 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2018-05-13 18:01:50,541 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:02:00,728 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:02:10,904 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:02:21,072 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:02:31,264 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:02:41,452 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:02:51,620 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:03:01,820 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:03:11,964 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:03:22,157 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:03:32,280 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:03:42,452 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:03:52,624 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:04:02,755 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:04:12,880 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:04:23,040 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:04:33,240 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:04:43,380 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:04:53,504 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> get svc hadoop-external -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-05-13 18:04:53,642 ed7c8f56 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 655, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 535, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
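
The TypeError hides the intended error message: in Python 2.7, raising a bare string is no longer allowed, so the `raise "Could not retrieve LoadBalancer IP address"` at beam_pipeline_options.py line 153 fails itself. A minimal sketch of the usual fix, with illustrative names rather than PerfKitBenchmarker's actual code:

    # Raising an exception instance surfaces the real message instead of
    # "exceptions must be old-style classes or derived from BaseException, not str".
    def RetrieveLoadBalancerIp(option_descriptor):
        ip = None  # stand-in for the kubectl jsonpath lookup shown in the log
        if not ip:
            # raise "Could not retrieve LoadBalancer IP address"  # current code: triggers the TypeError above
            raise RuntimeError("Could not retrieve LoadBalancer IP address")
        return ip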
2018-05-13 18:04:53,643 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-05-13 18:04:53,643 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster.yml> --ignore-not-found
2018-05-13 18:04:54,025 ed7c8f56 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/config-filebasedioithdfs-1526223664075> delete -f <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/SmallITCluster/hdfs-single-datanode-cluster-for-local-dev.yml> --ignore-not-found
2018-05-13 18:04:54,184 ed7c8f56 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 789, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 655, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 535, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
2018-05-13 18:04:54,185 ed7c8f56 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-13 18:04:54,202 ed7c8f56 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-13 18:04:54,203 ed7c8f56 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/ed7c8f56/pkb.log>
2018-05-13 18:04:54,203 ed7c8f56 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/ws/runs/ed7c8f56/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Jenkins build is back to normal : beam_PerformanceTests_TextIOIT_HDFS #172

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_TextIOIT_HDFS/172/display/redirect>