Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/04/14 06:18:15 UTC

Build failed in Jenkins: beam_PerformanceTests_AvroIOIT_HDFS #1499

See <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/1499/display/redirect>

------------------------------------------
[...truncated 280.17 KB...]
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s64
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s65
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s66
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s67
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s68
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map as step s69
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey as step s70
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues as step s71
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map as step s72
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) as step s73
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly as step s74
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) as step s75
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView as step s76
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Delete test files as step s77
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/avroioit0writethenreadall-jenkins-0414061547-4aa93df4/output/results/staging/
    Apr 14, 2019 6:16:07 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <189688 bytes, hash gFxKLere2amx5vqb3JTPrQ> to gs://temp-storage-for-perf-tests/avroioit0writethenreadall-jenkins-0414061547-4aa93df4/output/results/staging/pipeline-gFxKLere2amx5vqb3JTPrQ.pb

org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll STANDARD_OUT
    Dataflow SDK version: 2.13.0-SNAPSHOT

org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll STANDARD_ERROR
    Apr 14, 2019 6:16:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_23_16_08-3019148559259387985?project=apache-beam-testing

org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll STANDARD_OUT
    Submitted job: 2019-04-13_23_16_08-3019148559259387985

org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll STANDARD_ERROR
    Apr 14, 2019 6:16:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2019-04-13_23_16_08-3019148559259387985
    Apr 14, 2019 6:16:09 AM org.apache.beam.runners.dataflow.TestDataflowRunner run
    INFO: Running Dataflow job 2019-04-13_23_16_08-3019148559259387985 with 1 expected assertions.
    Apr 14, 2019 6:16:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T06:16:08.118Z: Autoscaling is enabled for job 2019-04-13_23_16_08-3019148559259387985. The number of workers will be between 1 and 1000.
    Apr 14, 2019 6:16:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T06:16:08.164Z: Autoscaling was automatically enabled for job 2019-04-13_23_16_08-3019148559259387985.
    Apr 14, 2019 6:16:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T06:16:11.182Z: Checking permissions granted to controller Service Account.
    Apr 14, 2019 6:16:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T06:16:18.111Z: Worker configuration: n1-standard-1 in us-central1-b.
    Apr 14, 2019 6:16:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-04-14T06:16:18.636Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/7238 instances, 1/0 CPUs, 250/116479 disk GB, 0/4046 SSD disk GB, 1/194 instance groups, 1/194 managed instance groups, 1/132 instance templates, 1/493 in-use IP addresses.

    Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
    Apr 14, 2019 6:16:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T06:16:18.740Z: Cleaning up.
    Apr 14, 2019 6:16:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T06:16:18.799Z: Worker pool stopped.
    Apr 14, 2019 6:16:24 AM org.apache.beam.runners.dataflow.TestDataflowRunner$ErrorMonitorMessagesHandler process
    INFO: Dataflow job 2019-04-13_23_16_08-3019148559259387985 threw exception. Failure message was: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/7238 instances, 1/0 CPUs, 250/116479 disk GB, 0/4046 SSD disk GB, 1/194 instance groups, 1/194 managed instance groups, 1/132 instance templates, 1/493 in-use IP addresses.

    Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
    Apr 14, 2019 6:16:24 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2019-04-13_23_16_08-3019148559259387985 failed with status FAILED.
    Apr 14, 2019 6:16:24 AM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    WARNING: Metrics not present for Dataflow job 2019-04-13_23_16_08-3019148559259387985.
    Apr 14, 2019 6:16:24 AM org.apache.beam.runners.dataflow.TestDataflowRunner run
    WARNING: Dataflow job 2019-04-13_23_16_08-3019148559259387985 did not output a success or failure metric.

Gradle Test Executor 1 finished executing tests.

> Task :beam-sdks-java-io-file-based-io-tests:integrationTest FAILED

org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll FAILED
    java.lang.RuntimeException: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/7238 instances, 1/0 CPUs, 250/116479 disk GB, 0/4046 SSD disk GB, 1/194 instance groups, 1/194 managed instance groups, 1/132 instance templates, 1/493 in-use IP addresses.

    Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:350)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:331)
        at org.apache.beam.sdk.io.avro.AvroIOIT.writeThenReadAll(AvroIOIT.java:147)
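
The quota summary above is the root cause: the shared apache-beam-testing project had 0 CPUs of its regional quota free in us-central1 when the job asked for a single n1-standard-1 worker, so Dataflow rejected the workflow before any worker started. A minimal sketch for checking the regional quota from a shell with the gcloud SDK authenticated for that project (project and region taken from the log above; the exact output format depends on the SDK version):

    # print the regional quotas (including the CPUS usage/limit) for the project named in the log
    gcloud compute regions describe us-central1 --project=apache-beam-testing

The CPUS entry in the printed quotas list shows current usage against the limit; this kind of failure normally clears once concurrently running test jobs release their workers, or once more quota is requested as described at https://cloud.google.com/compute/docs/resource-quotas.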
Finished generating test XML results (0.019 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest>
:beam-sdks-java-io-file-based-io-tests:integrationTest (Thread[Execution worker for ':' Thread 13,5,main]) completed. Took 43.46 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
93 actionable tasks: 1 executed, 92 up-to-date

Publishing build scan...
https://gradle.com/s/g2smoyzw5pelo


STDERR: 
1 test completed, 1 failed

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:121)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:117)
	at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:184)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:110)
	at org.gradle.api.internal.tasks.execution.ResolveIncrementalChangesTaskExecuter.execute(ResolveIncrementalChangesTaskExecuter.java:84)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.FinishSnapshotTaskInputsBuildOperationTaskExecuter.execute(FinishSnapshotTaskInputsBuildOperationTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:102)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionStateTaskExecuter.execute(ResolveBeforeExecutionStateTaskExecuter.java:74)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
	at org.gradle.api.internal.tasks.execution.StartSnapshotTaskInputsBuildOperationTaskExecuter.execute(StartSnapshotTaskInputsBuildOperationTaskExecuter.java:52)
	at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:94)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:63)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:46)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>
	at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:626)
	at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:498)
	at org.gradle.api.tasks.testing.Test.executeTests(Test.java:587)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:103)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:48)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:41)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:705)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:672)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$4.run(ExecuteActionsTaskExecuter.java:338)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:327)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:312)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:75)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:158)
	at org.gradle.internal.execution.impl.steps.ExecuteStep.execute(ExecuteStep.java:46)
	at org.gradle.internal.execution.impl.steps.CancelExecutionStep.execute(CancelExecutionStep.java:34)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:69)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.execute(TimeoutStep.java:49)
	at org.gradle.internal.execution.impl.steps.CatchExceptionStep.execute(CatchExceptionStep.java:34)
	at org.gradle.internal.execution.impl.steps.CreateOutputsStep.execute(CreateOutputsStep.java:49)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:42)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:28)
	at org.gradle.internal.execution.impl.steps.CacheStep.executeWithoutCache(CacheStep.java:133)
	at org.gradle.internal.execution.impl.steps.CacheStep.lambda$execute$5(CacheStep.java:83)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:82)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:37)
	at org.gradle.internal.execution.impl.steps.PrepareCachingStep.execute(PrepareCachingStep.java:33)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:38)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:23)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:95)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.lambda$execute$0(SkipUpToDateStep.java:88)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:52)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:36)
	at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:34)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:109)
	... 40 more


* Get more help at https://help.gradle.org

BUILD FAILED in 56s

2019-04-14 06:16:26,115 1164c17b MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 754, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 605, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-04-14 06:16:26,116 1164c17b MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-04-14 06:16:26,117 1164c17b MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1499> delete -f <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml> --ignore-not-found
2019-04-14 06:18:15,183 1164c17b MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 897, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 754, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 605, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-04-14 06:18:15,183 1164c17b MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-04-14 06:18:15,184 1164c17b MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-04-14 06:18:15,184 1164c17b MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/runs/1164c17b/pkb.log>
2019-04-14 06:18:15,184 1164c17b MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/runs/1164c17b/completion_statuses.json>
Build step 'Execute shell' marked build as failure


Jenkins build is back to normal : beam_PerformanceTests_AvroIOIT_HDFS #1502

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/1502/display/redirect>



Build failed in Jenkins: beam_PerformanceTests_AvroIOIT_HDFS #1501

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/1501/display/redirect>

------------------------------------------
[...truncated 311.72 KB...]
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.416Z: Fusing unzipped copy of Write Avro records to files/Write/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow), through flatten Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/Flatten.PCollections, into producer Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.450Z: Fusing consumer Write Avro records to files/Write/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow) into Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/DropShardNum
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.498Z: Fusing consumer Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write into Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.543Z: Fusing consumer Write Avro records to files/Write/RewindowIntoGlobal/Window.Assign into Collect start time
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.591Z: Fusing consumer Produce text lines into Generate sequence/Read(BoundedCountingSource)
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.639Z: Fusing consumer Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow into Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.682Z: Fusing consumer Collect start time into Produce Avro records
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.720Z: Fusing consumer Produce Avro records into Produce text lines
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.768Z: Fusing consumer Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/WriteUnwritten into Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.814Z: Fusing consumer Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles into Write Avro records to files/Write/RewindowIntoGlobal/Window.Assign
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.853Z: Fusing consumer Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/DropShardNum into Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/WriteUnwritten
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.888Z: Fusing consumer Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify into Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.925Z: Fusing consumer PAssert$0/GroupGlobally/GroupDummyAndContents/Reify into PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:53.972Z: Fusing consumer PAssert$0/GroupGlobally/GroupDummyAndContents/Write into PAssert$0/GroupGlobally/GroupDummyAndContents/Reify
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:54.020Z: Fusing consumer PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:54.512Z: Executing operation Write Avro records to files/Write/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:54.553Z: Executing operation Read all files/FileIO.MatchAll/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:54.594Z: Executing operation Read all files/Read all via FileBasedSource/Reshuffle/Reshuffle/GroupByKey/Create
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:54.616Z: Starting 1 workers in us-central1-a...
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:54.644Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:54.691Z: Executing operation PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Create
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:54.744Z: Executing operation PAssert$0/GroupGlobally/GroupDummyAndContents/Create
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:54.792Z: Executing operation Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:54.838Z: Executing operation View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    Apr 14, 2019 6:18:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:54.879Z: Executing operation View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Apr 14, 2019 6:18:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:55.285Z: Executing operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign+PAssert$0/GroupGlobally/GroupDummyAndContents/Reify+PAssert$0/GroupGlobally/GroupDummyAndContents/Write
    Apr 14, 2019 6:18:57 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:18:55.323Z: Executing operation Generate sequence/Read(BoundedCountingSource)+Produce text lines+Produce Avro records+Collect start time+Write Avro records to files/Write/RewindowIntoGlobal/Window.Assign+Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write Avro records to files/Write/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write Avro records to files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
    Apr 14, 2019 6:19:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:19:09.763Z: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
    Apr 14, 2019 6:19:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-04-14T18:19:16.954Z: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
    Apr 14, 2019 6:19:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-04-14T18:19:17.005Z: Workflow failed. Causes: Internal Issue (47ae241bffbe5fc6): 82159483:17
    Apr 14, 2019 6:19:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:19:17.197Z: Cleaning up.
    Apr 14, 2019 6:19:17 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:19:17.307Z: Stopping worker pool...
    Apr 14, 2019 6:19:17 PM org.apache.beam.runners.dataflow.TestDataflowRunner$ErrorMonitorMessagesHandler process
    INFO: Dataflow job 2019-04-14_11_18_38-15204688572829050205 threw exception. Failure message was: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.
    Apr 14, 2019 6:19:17 PM org.apache.beam.runners.dataflow.TestDataflowRunner$ErrorMonitorMessagesHandler process
    INFO: Dataflow job 2019-04-14_11_18_38-15204688572829050205 threw exception. Failure message was: Workflow failed. Causes: Internal Issue (47ae241bffbe5fc6): 82159483:17
    Apr 14, 2019 6:19:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-04-14T18:19:27.549Z: Worker pool stopped.
    Apr 14, 2019 6:19:33 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    INFO: Job 2019-04-14_11_18_38-15204688572829050205 failed with status FAILED.
    Apr 14, 2019 6:19:33 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
    INFO: Dataflow job 2019-04-14_11_18_38-15204688572829050205 terminated in failure state FAILED without reporting a failed assertion
    Apr 14, 2019 6:19:33 PM org.apache.beam.runners.dataflow.TestDataflowRunner run
    WARNING: Dataflow job 2019-04-14_11_18_38-15204688572829050205 did not output a success or failure metric.

Gradle Test Executor 1 finished executing tests.

> Task :beam-sdks-java-io-file-based-io-tests:integrationTest FAILED

org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll FAILED
    java.lang.RuntimeException: Startup of the worker pool in zone us-central1-a failed to bring up any of the desired 1 workers. Please check for errors in your job parameters, check quota and retry later, or please try in a different zone/region.Workflow failed. Causes: Internal Issue (47ae241bffbe5fc6): 82159483:17
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:350)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:331)
        at org.apache.beam.sdk.io.avro.AvroIOIT.writeThenReadAll(AvroIOIT.java:147)
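
In this run the workflow was accepted but the single requested worker never came up in us-central1-a, and the job failed with an opaque internal issue; the error text itself suggests retrying in a different zone or region. A hedged sketch of re-running just this integration test against another region, assuming the Beam file-based IO tests accept pipeline options through the -DintegrationTestPipelineOptions Gradle property, and omitting the IO-specific options (record count, filename prefix, HDFS configuration) that the Jenkins job normally supplies:

    # re-run only AvroIOIT, pointing TestDataflowRunner at a different region than the failing one
    ./gradlew :beam-sdks-java-io-file-based-io-tests:integrationTest \
      --tests org.apache.beam.sdk.io.avro.AvroIOIT \
      -DintegrationTestPipelineOptions='[
        "--runner=TestDataflowRunner",
        "--project=apache-beam-testing",
        "--tempRoot=gs://temp-storage-for-perf-tests",
        "--region=us-west1"
      ]'

The project and bucket names come from the log above; the property and option names themselves are assumptions about what this test suite accepts. Changing the region moves the job onto a different set of zones, which is usually enough when a single zone is failing to provision workers.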
Finished generating test XML results (0.024 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.032 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest>
:beam-sdks-java-io-file-based-io-tests:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 1 mins 10.459 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
93 actionable tasks: 1 executed, 92 up-to-date

Publishing build scan...
https://gradle.com/s/l4olsq4aksbls


STDERR: 
1 test completed, 1 failed

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:121)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:117)
	at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:184)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:110)
	at org.gradle.api.internal.tasks.execution.ResolveIncrementalChangesTaskExecuter.execute(ResolveIncrementalChangesTaskExecuter.java:84)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.FinishSnapshotTaskInputsBuildOperationTaskExecuter.execute(FinishSnapshotTaskInputsBuildOperationTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:102)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionStateTaskExecuter.execute(ResolveBeforeExecutionStateTaskExecuter.java:74)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
	at org.gradle.api.internal.tasks.execution.StartSnapshotTaskInputsBuildOperationTaskExecuter.execute(StartSnapshotTaskInputsBuildOperationTaskExecuter.java:52)
	at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:94)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:63)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:46)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>
	at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:626)
	at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:498)
	at org.gradle.api.tasks.testing.Test.executeTests(Test.java:587)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:103)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:48)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:41)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:705)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:672)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$4.run(ExecuteActionsTaskExecuter.java:338)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:327)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:312)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:75)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:158)
	at org.gradle.internal.execution.impl.steps.ExecuteStep.execute(ExecuteStep.java:46)
	at org.gradle.internal.execution.impl.steps.CancelExecutionStep.execute(CancelExecutionStep.java:34)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:69)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.execute(TimeoutStep.java:49)
	at org.gradle.internal.execution.impl.steps.CatchExceptionStep.execute(CatchExceptionStep.java:34)
	at org.gradle.internal.execution.impl.steps.CreateOutputsStep.execute(CreateOutputsStep.java:49)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:42)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:28)
	at org.gradle.internal.execution.impl.steps.CacheStep.executeWithoutCache(CacheStep.java:133)
	at org.gradle.internal.execution.impl.steps.CacheStep.lambda$execute$5(CacheStep.java:83)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:82)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:37)
	at org.gradle.internal.execution.impl.steps.PrepareCachingStep.execute(PrepareCachingStep.java:33)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:38)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:23)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:95)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.lambda$execute$0(SkipUpToDateStep.java:88)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:52)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:36)
	at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:34)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:109)
	... 40 more


* Get more help at https://help.gradle.org

BUILD FAILED in 1m 23s

2019-04-14 18:19:34,604 2f164d47 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 754, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 605, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-04-14 18:19:34,605 2f164d47 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-04-14 18:19:34,605 2f164d47 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1501> delete -f <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml> --ignore-not-found
2019-04-14 18:21:15,174 2f164d47 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 897, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 754, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 605, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-04-14 18:21:15,174 2f164d47 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-04-14 18:21:15,175 2f164d47 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-04-14 18:21:15,175 2f164d47 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/runs/2f164d47/pkb.log>
2019-04-14 18:21:15,175 2f164d47 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/runs/2f164d47/completion_statuses.json>
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_AvroIOIT_HDFS #1500

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/1500/display/redirect>

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-9 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/>
No credentials specified
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 98faabff70b58935b7cbc7efeacc8ec1a850479d (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 98faabff70b58935b7cbc7efeacc8ec1a850479d
Commit message: "[BEAM-6942]  Make modifications to pipeline options to be visible to all views. (#8225)"
 > git rev-list --no-walk 98faabff70b58935b7cbc7efeacc8ec1a850479d # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins3471042094436849321.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running [gcloud.container.clusters.get-credentials] with arguments: [--verbosity: "debug", --zone: "us-central1-a", NAME: "io-datastores"]
Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format: "default"
DEBUG: SDK update checks are disabled.
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins6745981975314777580.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1500>
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5502292445553661367.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1500> create namespace beam-performancetests-avroioit-hdfs-1500
namespace/beam-performancetests-avroioit-hdfs-1500 created
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5238836410102084290.sh
++ kubectl config current-context
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1500> config set-context gke_apache-beam-testing_us-central1-a_io-datastores --namespace=beam-performancetests-avroioit-hdfs-1500
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins4728545060724451278.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker>
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5132446043049519534.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env>
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins178044864975334476.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env>
Using base prefix '/usr'
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env/bin/python3>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5080838230970214269.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python3.5/site-packages (41.0.0)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python3.5/site-packages (19.0.3)
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins5472742254374750870.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_AvroIOIT_HDFS] $ /bin/bash -xe /tmp/jenkins7095047829474353056.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt>
Collecting absl-py (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 14))
  Using cached https://files.pythonhosted.org/packages/da/3f/9b0355080b81b15ba6a9ffcf1f5ea39e307a2778b2f2dc8694724e8abd5b/absl-py-0.7.1.tar.gz
Collecting jinja2>=2.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 15))
  Using cached https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in ./env/.perfkit_env/lib/python3.5/site-packages (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 16)) (41.0.0)
Collecting colorlog[windows]==2.6.0 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 17))
  Using cached https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 18))
  Using cached https://files.pythonhosted.org/packages/1b/51/e2a9f3b757eb802f61dc1f2b09c8c99f6eb01cf06416c0671253536517b6/blinker-1.4.tar.gz
Collecting futures>=3.0.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 19))
  Using cached https://files.pythonhosted.org/packages/cc/26/b61e3a4eb50653e8a7339d84eeaa46d1e93b92951978873c220ae64d0733/futures-3.1.1.tar.gz
Collecting PyYAML==3.12 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 20))
  Using cached https://files.pythonhosted.org/packages/4a/85/db5a2df477072b2902b0eb892feb37d88ac635d36245a72a6a69b23b383a/PyYAML-3.12.tar.gz
Collecting pint>=0.7 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 21))
  Using cached https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
Collecting numpy==1.13.3 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 22))
  Using cached https://files.pythonhosted.org/packages/0d/41/6c224571decd61c2578baedfdb0eec6283617c6679c35b20973f4e68aeaf/numpy-1.13.3-cp35-cp35m-manylinux1_x86_64.whl
Collecting functools32 (from -r <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/requirements.txt> (line 23))
  Using cached https://files.pythonhosted.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz
    Complete output from command python setup.py egg_info:
    This backport is for Python 2.7 only.
    
    ----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-install-9pzs7kry/functools32/
Build step 'Execute shell' marked build as failure
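
This run never reached Gradle: the PerfKit virtualenv was created with the worker's default Python 3.5 interpreter (note the 'New python executable in .../bin/python3' line above), but PerfKitBenchmarker's requirements.txt at this revision pins functools32, a backport that only installs on Python 2.7, so pip aborts and the shell step fails. A sketch of the likely remedy, assuming a python2.7 interpreter is available on the Jenkins worker and reusing the workspace-relative paths from the log:

    # recreate the PerfKit virtualenv with a Python 2.7 interpreter instead of the default python3
    rm -rf env/.perfkit_env
    virtualenv -p /usr/bin/python2.7 env/.perfkit_env
    env/.perfkit_env/bin/pip install --upgrade setuptools pip
    # functools32 (and the rest of the pinned requirements) build cleanly under Python 2.7
    env/.perfkit_env/bin/pip install -r PerfKitBenchmarker/requirements.txt

On Python 3 the backport is redundant in any case, since functools.lru_cache is already in the standard library; the pin only makes sense for a Python 2 environment.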

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org