Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/05/12 00:28:38 UTC

Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #44

See <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/44/display/redirect>

------------------------------------------
[...truncated 322.05 KB...]
    INFO: 2019-05-12T00:22:13.783Z: Executing operation Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    May 12, 2019 12:22:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:22:13.829Z: Executing operation Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    May 12, 2019 12:22:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:22:13.876Z: Executing operation Write using Hadoop OutputFormat/GroupDataByPartition/GroupByTaskId/Create
    May 12, 2019 12:22:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:22:13.876Z: Starting 1 workers in us-central1-a...
    May 12, 2019 12:22:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:22:13.926Z: Executing operation Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Create
    May 12, 2019 12:22:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:22:13.982Z: Executing operation Prevent fusion before writing/Reshuffle/GroupByKey/Create
    May 12, 2019 12:22:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:22:14.348Z: Executing operation Write using Hadoop OutputFormat/CreateOutputConfig/Read(CreateSource)+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    May 12, 2019 12:22:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:22:14.423Z: Executing operation Generate sequence/Read(BoundedCountingSource)+Produce db rows+Prevent fusion before writing/Pair with random key+Prevent fusion before writing/Reshuffle/Window.Into()/Window.Assign+Prevent fusion before writing/Reshuffle/GroupByKey/Reify+Prevent fusion before writing/Reshuffle/GroupByKey/Write
    May 12, 2019 12:22:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:22:45.064Z: Workers have started successfully.
    May 12, 2019 12:22:57 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:22:56.698Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    May 12, 2019 12:23:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:23:16.608Z: Workers have started successfully.
    May 12, 2019 12:23:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:23:42.885Z: Executing operation Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    May 12, 2019 12:23:43 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:23:43.001Z: Executing operation Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    May 12, 2019 12:24:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:24:13.725Z: Executing operation Prevent fusion before writing/Reshuffle/GroupByKey/Close
    May 12, 2019 12:24:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:24:15.867Z: Executing operation Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    May 12, 2019 12:24:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:24:15.973Z: Executing operation Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    May 12, 2019 12:24:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:24:23.866Z: Executing operation Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView
    May 12, 2019 12:24:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:24:24.134Z: Executing operation Prevent fusion before writing/Reshuffle/GroupByKey/Read+Prevent fusion before writing/Reshuffle/GroupByKey/GroupByWindow+Prevent fusion before writing/Reshuffle/ExpandIterable+Prevent fusion before writing/Values/Values/Map+Collect write time+Construct rows for DBOutputFormat+Write using Hadoop OutputFormat/ParDo(SetupJob)+Write using Hadoop OutputFormat/GroupDataByPartition/AssignTask+Write using Hadoop OutputFormat/GroupDataByPartition/GroupByTaskId/Reify+Write using Hadoop OutputFormat/GroupDataByPartition/GroupByTaskId/Write
    May 12, 2019 12:25:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:25:13.551Z: Executing operation Write using Hadoop OutputFormat/GroupDataByPartition/GroupByTaskId/Close
    May 12, 2019 12:25:14 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:25:13.659Z: Executing operation Write using Hadoop OutputFormat/GroupDataByPartition/GroupByTaskId/Read+Write using Hadoop OutputFormat/GroupDataByPartition/GroupByTaskId/GroupByWindow+Write using Hadoop OutputFormat/GroupDataByPartition/FlattenGroupedTasks+Write using Hadoop OutputFormat/Write+Write using Hadoop OutputFormat/CollectWriteTasks/WithKeys/AddKeys/Map+Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey+Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Partial+Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Reify+Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Write
    May 12, 2019 12:26:33 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:26:33.940Z: Executing operation Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Close
    May 12, 2019 12:26:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:26:34.036Z: Executing operation Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Read+Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues+Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Extract+Write using Hadoop OutputFormat/CollectWriteTasks/Values/Values/Map+Write using Hadoop OutputFormat/CommitWriteJob
    May 12, 2019 12:26:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:26:38.464Z: Cleaning up.
    May 12, 2019 12:26:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-05-12T00:26:38.568Z: Stopping worker pool...
    May 12, 2019 12:28:35 AM org.apache.beam.runners.dataflow.DataflowPipelineJob lambda$waitUntilFinish$0
    WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not cancel it.
    To cancel the job in the cloud, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2019-05-11_17_21_57-12433266396166328197

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat SKIPPED

> Task :beam-sdks-java-io-hadoop-format:integrationTest FAILED
:beam-sdks-java-io-hadoop-format:integrationTest (Thread[Execution worker for ':' Thread 9,5,main]) completed. Took 6 mins 55.242 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
89 actionable tasks: 2 executed, 87 up-to-date

STDERR: 
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-io-hadoop-format:integrationTest'.
> Process 'Gradle Test Executor 1' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
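
    For context: on Linux, exit value 143 conventionally means the process was terminated by SIGTERM (128 + 15), so the forked test JVM was most likely killed externally (for example by a memory or timeout limit on the CI agent) rather than failing a test assertion. A minimal, illustrative Python sketch of that exit-code convention (not part of the build output):

        import signal

        exit_value = 143                      # the exit value reported above
        if exit_value > 128:
            sig = signal.Signals(exit_value - 128)
            print("killed by %s (%d)" % (sig.name, sig.value))  # killed by SIGTERM (15)

    The "daemon disappeared" failure reported further below is consistent with the same external kill also reaching the Gradle daemon.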

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-hadoop-format:integrationTest'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:121)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:117)
	at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:184)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:110)
	at org.gradle.api.internal.tasks.execution.ResolveIncrementalChangesTaskExecuter.execute(ResolveIncrementalChangesTaskExecuter.java:84)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.FinishSnapshotTaskInputsBuildOperationTaskExecuter.execute(FinishSnapshotTaskInputsBuildOperationTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:102)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionStateTaskExecuter.execute(ResolveBeforeExecutionStateTaskExecuter.java:74)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
	at org.gradle.api.internal.tasks.execution.StartSnapshotTaskInputsBuildOperationTaskExecuter.execute(StartSnapshotTaskInputsBuildOperationTaskExecuter.java:52)
	at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:94)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:63)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:46)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.process.internal.ExecException: Process 'Gradle Test Executor 1' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
	at org.gradle.api.internal.tasks.testing.worker.ForkingTestClassProcessor.stop(ForkingTestClassProcessor.java:163)
	at org.gradle.api.internal.tasks.testing.processors.RestartEveryNTestClassProcessor.endBatch(RestartEveryNTestClassProcessor.java:77)
	at org.gradle.api.internal.tasks.testing.processors.RestartEveryNTestClassProcessor.stop(RestartEveryNTestClassProcessor.java:62)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.FailureHandlingDispatch.dispatch(FailureHandlingDispatch.java:29)
	at org.gradle.internal.dispatch.AsyncDispatch.dispatchMessages(AsyncDispatch.java:87)
	at org.gradle.internal.dispatch.AsyncDispatch.access$000(AsyncDispatch.java:36)
	at org.gradle.internal.dispatch.AsyncDispatch$1.run(AsyncDispatch.java:71)
	at org.gradle.internal.concurrent.InterruptibleRunnable.run(InterruptibleRunnable.java:42)
	at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
	... 3 more


* Get more help at https://help.gradle.org

BUILD FAILED in 7m 1s
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=52266b0b-1aad-49c2-b104-cdf723e75e15, currentDir=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 791
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-791.out.log
----- Last  20 lines from daemon log file - daemon-791.out.log -----
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
	at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
	at org.gradle.internal.dispatch.FailureHandlingDispatch.dispatch(FailureHandlingDispatch.java:29)
	at org.gradle.internal.dispatch.AsyncDispatch.dispatchMessages(AsyncDispatch.java:87)
	at org.gradle.internal.dispatch.AsyncDispatch.access$000(AsyncDispatch.java:36)
	at org.gradle.internal.dispatch.AsyncDispatch$1.run(AsyncDispatch.java:71)
	at org.gradle.internal.concurrent.InterruptibleRunnable.run(InterruptibleRunnable.java:42)
	at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
	... 3 more


* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 7m 1s
89 actionable tasks: 2 executed, 87 up-to-date
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.launcher.daemon.client.DaemonDisappearedException: Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)
	at org.gradle.launcher.daemon.client.DaemonClient.handleDaemonDisappearance(DaemonClient.java:241)
	at org.gradle.launcher.daemon.client.DaemonClient.monitorBuild(DaemonClient.java:217)
	at org.gradle.launcher.daemon.client.DaemonClient.executeBuild(DaemonClient.java:179)
	at org.gradle.launcher.daemon.client.DaemonClient.execute(DaemonClient.java:142)
	at org.gradle.launcher.daemon.client.DaemonClient.execute(DaemonClient.java:94)
	at org.gradle.launcher.cli.RunBuildAction.run(RunBuildAction.java:55)
	at org.gradle.internal.Actions$RunnableActionAdapter.execute(Actions.java:207)
	at org.gradle.launcher.cli.CommandLineActionFactory$ParseAndBuildAction.execute(CommandLineActionFactory.java:403)
	at org.gradle.launcher.cli.CommandLineActionFactory$ParseAndBuildAction.execute(CommandLineActionFactory.java:376)
	at org.gradle.launcher.cli.ExceptionReportingAction.execute(ExceptionReportingAction.java:37)
	at org.gradle.launcher.cli.ExceptionReportingAction.execute(ExceptionReportingAction.java:23)
	at org.gradle.launcher.cli.CommandLineActionFactory$WithLogging.execute(CommandLineActionFactory.java:369)
	at org.gradle.launcher.cli.CommandLineActionFactory$WithLogging.execute(CommandLineActionFactory.java:299)
	at org.gradle.launcher.Main.doAction(Main.java:36)
	at org.gradle.launcher.bootstrap.EntryPoint.run(EntryPoint.java:45)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.launcher.bootstrap.ProcessBootstrap.runNoExit(ProcessBootstrap.java:60)
	at org.gradle.launcher.bootstrap.ProcessBootstrap.run(ProcessBootstrap.java:37)
	at org.gradle.launcher.GradleMain.main(GradleMain.java:23)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.wrapper.BootstrapMainStarter.start(BootstrapMainStarter.java:31)
	at org.gradle.wrapper.WrapperExecutor.execute(WrapperExecutor.java:108)
	at org.gradle.wrapper.GradleWrapperMain.main(GradleWrapperMain.java:61)


* Get more help at https://help.gradle.org

2019-05-12 00:28:37,337 9bf2873d MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
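
    For readers less familiar with PerfKitBenchmarker: the AssertionError above comes from the benchmark harness, which submits the Gradle integration-test command and treats any non-zero exit code as a failed run. A simplified, hypothetical sketch of that pattern (not the actual gcp_dpb_dataflow.py implementation; the command shown is only an example):

        import subprocess

        def submit_job(cmd):
            # Run the job submission command and surface a non-zero exit
            # code as a benchmark failure, mirroring the assert in the
            # traceback above.
            proc = subprocess.run(cmd, capture_output=True, text=True)
            assert proc.returncode == 0, "Integration Test Failed."
            return proc.stdout

        # Example (hypothetical invocation):
        # submit_job(["./gradlew", ":beam-sdks-java-io-hadoop-format:integrationTest"])
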
2019-05-12 00:28:37,339 9bf2873d MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-05-12 00:28:37,339 9bf2873d MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/config-beam-performancetests-hadoopformat-44> delete -f <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml> --ignore-not-found
2019-05-12 00:28:37,648 9bf2873d MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-12 00:28:37,649 9bf2873d MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-12 00:28:37,649 9bf2873d MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-12 00:28:37,649 9bf2873d MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/runs/9bf2873d/pkb.log>
2019-05-12 00:28:37,650 9bf2873d MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/runs/9bf2873d/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_HadoopFormat #45

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/45/display/redirect>

