Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/03/07 15:14:41 UTC
Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #7
See <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/7/display/redirect>
------------------------------------------
[...truncated 142.75 KB...]
[INFO] Excluding io.grpc:grpc-protobuf-nano:jar:1.2.0 from the shaded jar.
[INFO] Excluding com.google.protobuf.nano:protobuf-javanano:jar:3.0.0-alpha-5 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-core:jar:1.0.2 from the shaded jar.
[INFO] Excluding org.json:json:jar:20160810 from the shaded jar.
[INFO] Excluding com.google.cloud:google-cloud-spanner:jar:0.20.0b-beta from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-v1:jar:0.1.11b from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-v1:jar:0.1.11b from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-admin-database-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-cloud-spanner-admin-instance-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:grpc-google-longrunning-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengine:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.opencensus:opencensus-contrib-grpc-util:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.opencensus:opencensus-api:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.netty:netty-tcnative-boringssl-static:jar:1.1.33.Fork26 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-admin-database-v1:jar:0.1.9 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-common-protos:jar:0.1.9 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/beam/sdks/java/io/hadoop-input-format/target/beam-sdks-java-io-hadoop-input-format-2.5.0-SNAPSHOT.jar> with <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/beam/sdks/java/io/hadoop-input-format/target/beam-sdks-java-io-hadoop-input-format-2.5.0-SNAPSHOT-shaded.jar>
[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/beam/sdks/java/io/hadoop-input-format/target/beam-sdks-java-io-hadoop-input-format-2.5.0-SNAPSHOT-tests.jar> with <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/beam/sdks/java/io/hadoop-input-format/target/beam-sdks-java-io-hadoop-input-format-2.5.0-SNAPSHOT-shaded-tests.jar>
[INFO] Dependency-reduced POM written at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/beam/sdks/java/io/hadoop-input-format/dependency-reduced-pom.xml>
[INFO]
[INFO] --- maven-failsafe-plugin:2.20.1:integration-test (default) @ beam-sdks-java-io-hadoop-input-format ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/beam/sdks/java/io/hadoop-input-format/target/failsafe-reports>
[INFO] parallel='all', perCoreThreadCount=true, threadCount=4, useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, threadCountMethods=0, parallelOptimized=true
[INFO]
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 652.727 s <<< FAILURE! - in org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT
[ERROR] readUsingHadoopInputFormat(org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT) Time elapsed: 652.727 s <<< ERROR!
java.lang.RuntimeException:
(7658347bb58166cb): java.lang.RuntimeException: java.lang.RuntimeException: org.postgresql.util.PSQLException: The connection attempt failed.
at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.setConf(DBInputFormat.java:171)
at org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO$HadoopInputFormatBoundedSource.createInputFormatInstance(HadoopInputFormatIO.java:528)
at org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO$HadoopInputFormatBoundedSource.computeSplitsIfNecessary(HadoopInputFormatIO.java:484)
at org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIO$HadoopInputFormatBoundedSource.split(HadoopInputFormatIO.java:441)
at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:75)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:381)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:353)
at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: org.postgresql.util.PSQLException: The connection attempt failed.
at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.createConnection(DBInputFormat.java:205)
at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.setConf(DBInputFormat.java:164)
... 18 more
Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:272)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:215)
at org.postgresql.Driver.makeConnection(Driver.java:404)
at org.postgresql.Driver.connect(Driver.java:272)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.hadoop.mapreduce.lib.db.DBConfiguration.getConnection(DBConfiguration.java:154)
at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.createConnection(DBInputFormat.java:198)
... 19 more
Caused by: java.net.SocketTimeoutException: connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.postgresql.core.PGStream.<init>(PGStream.java:61)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:144)
... 27 more
at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:133)
at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:89)
at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:54)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:353)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
at org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT.readUsingHadoopInputFormat(HadoopInputFormatIOIT.java:149)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:324)
at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:324)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
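The root cause in the trace above is a plain TCP timeout (`java.net.SocketTimeoutException: connect timed out`) while `DBInputFormat` opened its JDBC connection, i.e. the Dataflow workers never reached the Postgres service that the test infra spins up from `postgres-service-for-local-dev.yml`. A quick pre-flight reachability probe like the sketch below (hypothetical helper, not part of the test harness; host/port are placeholders, not values taken from this build) can distinguish this infrastructure failure from a genuine HadoopInputFormatIO bug before a 10-minute Dataflow run is wasted:

```python
import socket


def postgres_reachable(host, port=5432, timeout_s=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout_s.

    Mirrors the failing step in the stack trace: if this probe times out or
    is refused, the integration test's PSQLException is an environment
    problem (service not up / firewall), not a code problem.
    """
    try:
        # create_connection raises OSError (incl. timeout / refused) on failure.
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:
        return False
```

For example, probing a port with nothing listening returns `False` quickly instead of hanging for the JDBC driver's full connect timeout.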
[INFO]
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR] HadoopInputFormatIOIT.readUsingHadoopInputFormat:149 Runtime (7658347bb58166...
[INFO]
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
[INFO]
[INFO]
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ beam-sdks-java-io-hadoop-input-format ---
[INFO] No dependency problems found
[INFO]
[INFO] --- maven-failsafe-plugin:2.20.1:verify (default) @ beam-sdks-java-io-hadoop-input-format ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/beam/sdks/java/io/hadoop-input-format/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:02 min
[INFO] Finished at: 2018-03-07T15:14:39Z
[INFO] Final Memory: 131M/1623M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.20.1:verify (default) on project beam-sdks-java-io-hadoop-input-format: There are test failures.
[ERROR]
[ERROR] Please refer to <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/beam/sdks/java/io/hadoop-input-format/target/failsafe-reports> for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.20.1:verify (default) on project beam-sdks-java-io-hadoop-input-format: There are test failures.
Please refer to <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/beam/sdks/java/io/hadoop-input-format/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures.
Please refer to <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/beam/sdks/java/io/hadoop-input-format/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
at org.apache.maven.plugin.surefire.SurefireHelper.throwException (SurefireHelper.java:240)
at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution (SurefireHelper.java:112)
at org.apache.maven.plugin.failsafe.VerifyMojo.execute (VerifyMojo.java:188)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
STDERR:
2018-03-07 15:14:39,878 7b5efd9d MainThread beam_integration_benchmark(1/1) ERROR Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 622, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 525, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-03-07 15:14:39,879 7b5efd9d MainThread beam_integration_benchmark(1/1) INFO Cleaning up benchmark beam_integration_benchmark
2018-03-07 15:14:39,879 7b5efd9d MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1520434662298> delete -f <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml>
2018-03-07 15:14:40,965 7b5efd9d MainThread beam_integration_benchmark(1/1) ERROR Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 732, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 622, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 525, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
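The traceback shows how PerfKit Benchmarker surfaces the Maven failure: `SubmitJob` treats any nonzero exit code from the launched process as a benchmark-ending assertion. A minimal sketch of that pattern (illustrative helper name, not PerfKit Benchmarker's actual API):

```python
import subprocess


def submit_job(cmd):
    """Run a job-submission command; fail the benchmark on nonzero exit.

    Same shape as the `assert retcode == 0` in gcp_dpb_dataflow.py above:
    the AssertionError propagates up through DoRunPhase/RunBenchmark and
    marks the whole benchmark FAILED, as in the status table below.
    """
    retcode = subprocess.call(cmd)
    assert retcode == 0, "Integration Test Failed."
```

Because the failure is a bare `AssertionError`, the benchmark status table can only report FAILED with no substatus; the actual cause has to be dug out of the Maven/failsafe logs, as seen here.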
2018-03-07 15:14:40,966 7b5efd9d MainThread beam_integration_benchmark(1/1) ERROR Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-03-07 15:14:40,966 7b5efd9d MainThread beam_integration_benchmark(1/1) INFO Benchmark run statuses:
---------------------------------------------------------------------------------
Name UID Status Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark beam_integration_benchmark0 FAILED
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-03-07 15:14:40,966 7b5efd9d MainThread beam_integration_benchmark(1/1) INFO Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/pkb.log>
2018-03-07 15:14:40,967 7b5efd9d MainThread beam_integration_benchmark(1/1) INFO Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/7b5efd9d/completion_statuses.json>
Build step 'Execute shell' marked build as failure
Jenkins build is back to normal : beam_PerformanceTests_HadoopInputFormat #9
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/9/display/redirect?page=changes>
Build failed in Jenkins: beam_PerformanceTests_HadoopInputFormat #8
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/8/display/redirect?page=changes>
Changes:
[herohde] Initial sketches of a Go SDK
[herohde] Initial version of the direct style w/ direct runner. Incomplete.
[herohde] Add Data as UserFn context w/ immediate value.
[herohde] Added no-I/O wordcount for profiling.
[herohde] Fleshed out possible approach to generic transformations.
[herohde] Add “dag” example that use multiplexing and side input.
[herohde] Added a more complex DAG example.
[herohde] Add yatzy example with more complex construction-time setup
[herohde] Add proto for Fn API
[herohde] Add beam.Composite helper for the most common pattern to align with java
[herohde] Move pipeline-construction time errors into an accumulator
[herohde] Add Dataflow job and Fn API clients. Incomplete.
[herohde] Add binary cross-compile and upload to Dataflow runner. Incomplete.
[herohde] Add tentative runner indirection (default: local).
[herohde] Made data flow runner detect user main for cross-compilation.
[herohde] Remove error accumulation in favor of panic.
[herohde] Improve Dataflow translation of coders, side input and composite names.
[herohde] Fix name for AsView.
[herohde] Use 2 grpc endpoints in harness
[herohde] Add gRPC harness logging
[herohde] Flesh out harness and serialization further.
[herohde] Made the dataflow runner wait for job termination by default
[herohde] beam:
[herohde] beam:
[herohde] combinefn.go: fix compilation issues
[herohde] Improve dataflow serialization and execution. Incomplete.
[herohde] Sleep 30 sec in wordcap to allow logs to propagate to Cloud Logging.
[herohde] Move the 30s sleep for logging to the harness instead of in WordCap.
[herohde] Post-review updates.
[herohde] Doc updates.
[herohde] Flesh out coders. Incomplete.
[herohde] Added prototype implementation of more coders and the runner source.
[herohde] dofn: illustrates how dofns are written.
[herohde] beam: add viewfn and windowfn to side inputs match support Beam 1.0
[herohde] dofn: timers
[herohde] Complete revamp: coders, graph and execution use element-wise
[herohde] Fix coder encoding for Dataflow side input. Otherwise, the job is
[herohde] Added more godoc comments to graph types.
[herohde] Added more comments plus made local GBK use coder equality.
[herohde] Added Flatten support and “forest” example that uses it.
[herohde] Move bigqueryio to defunct
[herohde] Make forest example print less
[herohde] Add external struct registry and serialization.
[herohde] Updated comments in node.go.
[herohde] Replace real type with 'full type' since that's the current term.
[herohde] Refactor Fn API dependency.
[herohde] Added more comments to the runner/dataflow and runner/beamexec packages
[herohde] Fix most go vet issues
[herohde] Make core operations panic to cut down on the error propagation
[herohde] Add more comments to the graph package.
[herohde] Add DoFn wrapper to handle either function or (ptr to) struct
[herohde] Fix remaining go vet warnings.
[herohde] Code review for beam/graph/coder package.
[herohde] Code review of the runtime/graphx package.
[herohde] Remove Data options in favor of using a Fn struct
[herohde] Code review of the beam/graph/userfn package.
[herohde] Code review for beam/graph package.
[herohde] godoc for runtime/graphx
[herohde] Add support for []T and Combine functions
[herohde] Add adapted documentation from the Java SDK to the beam package
[herohde] Update snapshot of Fn API.
[herohde] Add experiments flag to the Dataflow runner
[herohde] Remove context arg from beamexec.Init
[herohde] Migration to Runner API.
[herohde] Add support for creating DOT graphs.
[herohde] Make pretty printing of types and coders more concise
[herohde] Add flexible Signature to aid type checking
[herohde] Adding unit testability to harness translation.
[herohde] Fix crash due to initialization order
[herohde] Add CreateValues and Impulse
[herohde] Add Runner API support for WindowingStrategy.
[herohde] Run goimports on baseline.
[herohde] Fix encoding of global window strategy.
[herohde] Ensure the windowed value is atomically encoded.
[herohde] Limit gRPC messages to max size.
[herohde] Developer conveniences for running jobs.
[herohde] Fix sends to not close the network channel.
[herohde] Add re-iterable side input
[herohde] Add per-key Combine
[herohde] Add Min
[herohde] Reorganize non-user-facing code into core
[herohde] Make type register reject unnamed or predeclared types
[herohde] Add type specialization tool
[herohde] Don't run grpc plugin in generate phase.
[herohde] Fix import reference path for runner API proto.
[herohde] Revamp runner registration as _ imports
[herohde] Add stats.Max and Mean
[herohde] Add global pipeline options
[herohde] Unify global and per-key combiners
[herohde] Add beam convenience wrapper for imports and runner selection
[herohde] Add session recording and CPU profiling to harness.
[herohde] Add ptest and passert for testing pipelines
[herohde] Add GCS and glob support to textio
[herohde] Add BigQuery IO and examples
[herohde] Adds a session runner for testing.
[herohde] Add Partition and dynamic functions
[herohde] Adding example that returns 10 words that contain provided search
[herohde] Remove duplicate LOG line
[herohde] Enable Combine Fns in Dataflow runner by modifying translation.
[herohde] Fixing type bug by dropping T and using underlying type of value in
[herohde] Adding graph validation at build time.
[herohde] Import the Fn API changes.
[herohde] Simple changes to support new Fn API coder changes.
[herohde] Update translator to work with new Fn API changes.
[herohde] Use appropriate equality tests.
[herohde] Fix test to not use path of package.
[herohde] Renaming directory to match package name.
[herohde] Fixing random nits in comments.
[herohde] Modify build command to avoid bash.
[herohde] Fixing selected golint issues.
[herohde] Addressing import review comments.
[herohde] Add coder specialization for bytes/strings.
[herohde] Adding unit tests to stats.
[herohde] Fixing typo.
[herohde] Add beam.External
[herohde] Fix grpc.Dial calls to block properly.
[herohde] Creates a symtab verifier by running Sym2Addr and Addr2Sym in a binary.
[herohde] Add spec field to help interpretation of payload.
[herohde] Use beam.T alias for typex.T etc with Go 1.9
[herohde] Move shared GCP options to a separate package
[herohde] Update portability protos
[herohde] Remove old source/sink from beam package
[herohde] Add context-aware logging for both pipeline-submission time and runtime
[herohde] Fix coder inference for strings.
[herohde] Improve tornadoes example
[herohde] Fix beam.External to map arguments correctly.
[herohde] Added comments to yatzy and forest
[herohde] Add comments to tornadoes from the java counterpart
[herohde] Rename Pipeline Composite to Scope
[herohde] Add 3 progressive wordcount examples
[herohde] Clarify comments in wordcount pipelines
[herohde] Add apache 2.0 license to files
[herohde] Updates to examples.
[herohde] Adding more godoc for the main beam package.
[herohde] Update to new proto structure
[herohde] Split Combine and fields in to global and per-key variants
[herohde] Refactor Flatten of a single []T into Explode
[herohde] Rename local runner to direct runner
[herohde] Fix argument index error in ParDo execution
[herohde] Add Apache copyright header to files that need it.
[herohde] Made debug.Head not just work per bundle
[herohde] Impose a total ordering on Fn parameters.
[herohde] Rename Dedup to Distinct for consistency with other SDKs
[herohde] Add coder to model coder translation
[herohde] Simplify harness coder translation
[herohde] Split Pipeline into Pipeline and Scope
[herohde] Relocate Go SDK code
[herohde] Fix Go SDK maven build
[herohde] Move Go SKD to latest version of bigquery
[herohde] Add Go SDK container image
[herohde] Add Go SDK README
[herohde] Update version for Go Dataflow pipelines
[herohde] Make Scope a value type
[herohde] Add Go graph/pipeline translation
[herohde] Stage Go model pipeline for Dataflow
[herohde] Use pipeline unmarhaller in runtime harness
[herohde] CR: [BEAM-3287] Use model pipelines in Go SDK
[herohde] CR: [BEAM-3287] Use model pipelines in Go SDK
[herohde] Fix name of syscallx ErrUnsupported
[herohde] Allow any named type to be registered and serialized as external
[herohde] Add more package comments for core packages
[herohde] Make Go SDK External a graph primitive
[herohde] Cache Go runtime symbol lookups
[wcn] Fix storagePath variable.
[wcn] BEAM-3368 fix translation for external
[robertwb] [BEAM-3356] Add Go SDK int and varint custom coders (#4276)
[lcwik] BEAM-3361 Increase Go gRPC message size
[herohde] Go SDK runtime revamp
[lcwik] Add a few function call overhead benchmarks
[lcwik] Add type-specialized emitters
[lcwik] BEAM-3324 improve symtab memory usage
[lcwik] BEAM-3324 improve symtab memory usage
[lcwik] BEAM-3324 improve symtab memory usage
[lcwik] Store objects in pool so they can be reused.
[lcwik] Add builtin varint coder
[herohde] Type-specialize custom decoders and encoders in Go SDK runtime
[herohde] Type-specialize iterators and side input in the Go SDK
[herohde] Add warnings if Go runtime registrations are overwritten
[herohde] Add reusable element coders in Go SDK runtime
[wcn] Updated translater to preserve payload and its URN.
[herohde] Initial version of type-specialized general reflective calls
[herohde] Add general-purpose untyped callers in Go SDK runtime
[herohde] Use fast caller for filter transform predicate
[herohde] CR: Clarified comment on caller template
[herohde] Fix value encoding for Create
[lcwik] BEAM-3473: Fix GroupByKey iterators to be initialized.
[lcwik] BEAM-3474 Include stacks in panic messages.
[lcwik] BEAM-3299: Add source reporting support.
[lcwik] Remove GetId() call from under lock.
[lcwik] Add additional comments about concurrency invariants.
[lcwik] Add initialization of active plans map.
[lcwik] Renamed Go runtime Caller to Func and added name
[lcwik] Use reflectx.Func as the fundamental function representation
[lcwik] CR: fix DynFn comments
[lcwik] CR: fix comments
[herohde] Avoid reflect.Value conversions in Go runtime
[wcn] Allow grpcx.Dial to support overrides.
[rober] Use a typeswitch instead of reflect.Convert when encoding strings or
[github] Update coder.go
[robert] Replace reflective convert to direct convert.
[rober] Fix beam.Combine to combine globally
[herohde] Add optional function registration to Go SDK runtime
[rober] fixup! Remove reflection from varint codecs
[herohde] Changed core GBK to CoGBK
[herohde] Add CoGBK support to direct runner and Join example
[herohde] [BEAM-3316] Translate bundle descriptors directly to execution plans in
[herohde] Translate CoGBK into GBK for Dataflow and model pipeline runners
[herohde] CR: [BEAM-3302] Support CoGBK in the Go SDK
[herohde] [BEAM-3579] Fix textio.Write
[herohde] CR: fix Go SDK textio.Write
[wcn] Fixing filename.
[ehudm] Integration test for Python HDFS implementation.
[wcn] Improve rendering of DOT diagrams.
[herohde] Update Go SDK coder constants
[1028332163] replace mockito-all
[1028332163] replace mockito
[1028332163] replace mockito-all and hamcrest-all
[1028332163] replace mockito-all and hamcrest-all
[1028332163] replace mockito-all and hamcrest-all
[1028332163] replace mockito-all and hamcrest-all
[1028332163] replace mockito-all and hamcrest-all
[1028332163] replace mockito-all and hamcrest-all
[1028332163] replace mockito-all and hamcrest-all
[1028332163] replace mockito-all and hamcrest-all
[1028332163] ignoredUnusedDeclaredDependencies
[1028332163] ban hamcrest-all and mockito-all
[1028332163] ban mockito-all and hamcrest-all
[ehudm] Don't cache pubsub subscription prematurely.
[tgroh] Make Impulse#create() visible
[kirpichov] Makes it possible to use Wait with default windowing, e.g. in batch
[grzegorz.kolakowski] [BEAM-3043] Set user-specified PTransform names on Flink operators
[coheigea] Avoid unnecessary autoboxing by replacing Integer/Long.valueOf with
[tgroh] Update QueryablePipeline Factory Method Name
[tgroh] Scan Core Construction NeedsRunner Tests
[klk] Add MetricsTranslation
[rmannibucau] extracting the scheduled executor service in a factory variable in SDF
[echauchot] [BEAM-3681] Make S3FileSystem copy atomic for smaller than 5GB objects
[echauchot] [BEAM-3681] Add a comment for the extra check of objectSize in
[grzegorz.kolakowski] [BEAM-3753] Fix failing integration tests
[grzegorz.kolakowski] [BEAM-3753] Rename *ITCase.java tests files to *Test.java
[lcwik] [BEAM-3762] Update Dataflow worker container image to support unlimited
[github] Bump container image tag to fix incompatibility between the SDK and the
[github] Update dependency.py
[sidhom] Run NeedsRunner tests from direct runner gradle build
[ccy] Fix issue from incomplete removal of has_cache
[yifanzou] [BEAM-3735] copy mobile-gaming sources in to archetypes
[mairbek] Update SpannerIO to use Batch API.
[sidhom] Address review comments
[sidhom] Remove old sourceSets.test.output references
[rangadi] Update Readme and JavaDoc for KafkaIO.
[robertwb] Avoid warning in our default runner.
[github] [BEAM-3719] Adds support for reading side-inputs from SDFs
[github] print() is a function in Python 3
[ehudm] Add Python lint check for calls to unittest.main.
[tgroh] Add JavaReadViaImpulse to core-construction
[robertwb] [maven-release-plugin] prepare branch release-2.4.0
[robertwb] [maven-release-plugin] prepare for next development iteration
[robertwb] Bump Python dev version.
[tgroh] Revert "extracting the scheduled executor service in a factory variable
[rangadi] Mention support for older versions.
[amyrvold] Beam runner inventory, run as a cron job, not on each CL
[github] Fixing formatting bug in filebasedsink.py.
[github] Fix lint issue.
[ehudm] Add support for Pub/Sub messages with attributes.
[amyrvold] [BEAM-3775] Increase timeout in
[ehudm] Address review comment from PR #4744.
[mariagh] Add TestClock to test
[wcn] Update generated code to match import from 5e6db92
[XuMingmin] [BEAM-3754]: Fix readBytes() initialization (#4792)
[daniel.o.programmer] [BEAM-3126] Fixing incorrect function call in bundle processor.
[cclauss] Change unicode --> six.text_type for Python 3
[samuel.waggoner] [BEAM-3777] allow UDAFs to be indirect subclasses of CombineFn
[ehudm] Improve FileBasedSink rename safety.
[ehudm] Add missing import statements.
[lcwik] [BEAM-3690] ban mockito-all and hamcrest-all
[kenn] Bump sdks/go/container.pom.xml to 2.5.0-SNAPSHOT
[amyrvold] [BEAM-3791] Update version number in build_rules.gradle
[rmannibucau] Make StateInternals short state method defaulting to the implementation
------------------------------------------
[...truncated 44.62 KB...]
[INFO] Excluding com.google.api.grpc:proto-google-longrunning-v1:jar:0.1.11 from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-protos:jar:1.0.0-pre3 from the shaded jar.
[INFO] Excluding com.google.cloud.bigtable:bigtable-client-core:jar:1.0.0 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-appengine:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.opencensus:opencensus-contrib-grpc-util:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.opencensus:opencensus-api:jar:0.7.0 from the shaded jar.
[INFO] Excluding io.netty:netty-tcnative-boringssl-static:jar:1.1.33.Fork26 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-cloud-spanner-admin-database-v1:jar:0.1.9 from the shaded jar.
[INFO] Excluding com.google.api.grpc:proto-google-common-protos:jar:0.1.9 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.http-client:google-http-client-jackson2:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-dataflow:jar:v1b3-rev221-1.22.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-clouddebugger:jar:v2-rev8-1.22.0 from the shaded jar.
[INFO] Excluding com.google.apis:google-api-services-storage:jar:v1-rev71-1.22.0 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-credentials:jar:0.7.1 from the shaded jar.
[INFO] Excluding com.google.auth:google-auth-library-oauth2-http:jar:0.7.1 from the shaded jar.
[INFO] Excluding com.google.cloud.bigdataoss:util:jar:1.4.5 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-java6:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.api-client:google-api-client-jackson2:jar:1.22.0 from the shaded jar.
[INFO] Excluding com.google.oauth-client:google-oauth-client-java6:jar:1.22.0 from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/beam/sdks/java/io/hadoop-input-format/target/beam-sdks-java-io-hadoop-input-format-2.5.0-SNAPSHOT.jar> with <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/beam/sdks/java/io/hadoop-input-format/target/beam-sdks-java-io-hadoop-input-format-2.5.0-SNAPSHOT-shaded.jar>
[INFO] Replacing original test artifact with shaded test artifact.
[INFO] Replacing <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/beam/sdks/java/io/hadoop-input-format/target/beam-sdks-java-io-hadoop-input-format-2.5.0-SNAPSHOT-tests.jar> with <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/beam/sdks/java/io/hadoop-input-format/target/beam-sdks-java-io-hadoop-input-format-2.5.0-SNAPSHOT-shaded-tests.jar>
[INFO] Dependency-reduced POM written at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/beam/sdks/java/io/hadoop-input-format/dependency-reduced-pom.xml>
[INFO]
[INFO] --- maven-failsafe-plugin:2.20.1:integration-test (default) @ beam-sdks-java-io-hadoop-input-format ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/beam/sdks/java/io/hadoop-input-format/target/failsafe-reports>
[INFO] parallel='all', perCoreThreadCount=true, threadCount=4, useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, threadCountMethods=0, parallelOptimized=true
[INFO]
[INFO] -------------------------------------------------------
[INFO] T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT
[ERROR] Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 0 s <<< FAILURE! - in org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT
[ERROR] org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT Time elapsed: 0 s <<< ERROR!
org.postgresql.util.PSQLException: The connection attempt failed.
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:272)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:215)
at org.postgresql.Driver.makeConnection(Driver.java:404)
at org.postgresql.Driver.connect(Driver.java:272)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:86)
at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:71)
at org.apache.beam.sdk.io.common.DatabaseTestHelper.createTable(DatabaseTestHelper.java:46)
at org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT.setUp(HadoopInputFormatIOIT.java:92)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.junit.runners.Suite.runChild(Suite.java:128)
at org.junit.runners.Suite.runChild(Suite.java:27)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketTimeoutException: connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.postgresql.core.PGStream.<init>(PGStream.java:61)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:144)
... 29 more
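The root cause above is a plain TCP connect timeout to the PostgreSQL test instance, not a JDBC or credentials problem. Before rerunning the integration test, a quick stdlib-only reachability probe can confirm whether the database endpoint is up at all (host name below is a placeholder, not the actual Kubernetes service address used by this job):

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connect timeout, connection refused, and DNS failure --
        # the same class of error as the SocketTimeoutException above.
        return False

if __name__ == "__main__":
    # 5432 is the default PostgreSQL port; the host is a placeholder.
    print(is_reachable("postgres.example.invalid", 5432))
```

If this returns False against the test database's service address, the failure lies in the Kubernetes service or firewall configuration rather than in the Beam test code.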
[ERROR] org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT Time elapsed: 0 s <<< ERROR!
org.postgresql.util.PSQLException: The connection attempt failed.
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:272)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:215)
at org.postgresql.Driver.makeConnection(Driver.java:404)
at org.postgresql.Driver.connect(Driver.java:272)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:86)
at org.postgresql.ds.common.BaseDataSource.getConnection(BaseDataSource.java:71)
at org.apache.beam.sdk.io.common.DatabaseTestHelper.deleteTable(DatabaseTestHelper.java:57)
at org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT.tearDown(HadoopInputFormatIOIT.java:118)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.junit.runners.Suite.runChild(Suite.java:128)
at org.junit.runners.Suite.runChild(Suite.java:27)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketTimeoutException: connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.postgresql.core.PGStream.<init>(PGStream.java:61)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:144)
... 28 more
[INFO]
[INFO] Results:
[INFO]
[ERROR] Errors:
[ERROR] org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT.org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT
[ERROR] Run 1: HadoopInputFormatIOIT.setUp:92 PSQL The connection attempt failed.
[ERROR] Run 2: HadoopInputFormatIOIT.tearDown:118 PSQL The connection attempt failed.
[INFO]
[INFO]
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
[INFO]
[INFO]
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ beam-sdks-java-io-hadoop-input-format ---
[INFO] No dependency problems found
[INFO]
[INFO] --- maven-failsafe-plugin:2.20.1:verify (default) @ beam-sdks-java-io-hadoop-input-format ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/beam/sdks/java/io/hadoop-input-format/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:24 min
[INFO] Finished at: 2018-03-07T18:03:44Z
[INFO] Final Memory: 123M/1407M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.20.1:verify (default) on project beam-sdks-java-io-hadoop-input-format: There are test failures.
[ERROR]
[ERROR] Please refer to <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/beam/sdks/java/io/hadoop-input-format/target/failsafe-reports> for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.20.1:verify (default) on project beam-sdks-java-io-hadoop-input-format: There are test failures.
Please refer to <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/beam/sdks/java/io/hadoop-input-format/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures.
Please refer to <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/beam/sdks/java/io/hadoop-input-format/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
at org.apache.maven.plugin.surefire.SurefireHelper.throwException (SurefireHelper.java:240)
at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution (SurefireHelper.java:112)
at org.apache.maven.plugin.failsafe.VerifyMojo.execute (VerifyMojo.java:188)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
STDERR:
2018-03-07 18:03:45,784 60d52338 MainThread beam_integration_benchmark(1/1) ERROR Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 622, in RunBenchmark
DoRunPhase(spec, collector, detailed_timer)
File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 525, in DoRunPhase
samples = spec.BenchmarkRun(spec)
File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
job_type=job_type)
File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-03-07 18:03:45,846 60d52338 MainThread beam_integration_benchmark(1/1) INFO Cleaning up benchmark beam_integration_benchmark
2018-03-07 18:03:45,846 60d52338 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1520437858447> delete -f <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml>
2018-03-07 18:03:47,312 60d52338 MainThread beam_integration_benchmark(1/1) ERROR Exception running benchmark
Traceback (most recent call last):
File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 732, in RunBenchmarkTask
RunBenchmark(spec, collector)
File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 622, in RunBenchmark
DoRunPhase(spec, collector, detailed_timer)
File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 525, in DoRunPhase
samples = spec.BenchmarkRun(spec)
File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
job_type=job_type)
File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
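PerfKit only sees the Maven failure through the child process's exit code: the `assert retcode == 0` in `gcp_dpb_dataflow.py` turns any nonzero `mvn verify` exit status into the `AssertionError` above. A minimal sketch of that pattern (function name and child command are illustrative, not PerfKit's actual code):

```python
import subprocess
import sys

def submit_job(cmd):
    """Run a benchmark command and fail loudly on a nonzero exit code,
    mirroring the retcode check in gcp_dpb_dataflow.py."""
    retcode = subprocess.call(cmd)
    assert retcode == 0, "Integration Test Failed."

# A child process that exits nonzero, like the failed `mvn verify` here.
try:
    submit_job([sys.executable, "-c", "import sys; sys.exit(1)"])
except AssertionError as e:
    print(e)  # -> Integration Test Failed.
```

This is why the PerfKit log carries no detail about the PSQLException: the actual diagnosis has to come from the Maven/failsafe output earlier in the log.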
2018-03-07 18:03:47,312 60d52338 MainThread beam_integration_benchmark(1/1) ERROR Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-03-07 18:03:47,312 60d52338 MainThread beam_integration_benchmark(1/1) INFO Benchmark run statuses:
---------------------------------------------------------------------------------
Name UID Status Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark beam_integration_benchmark0 FAILED
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-03-07 18:03:47,312 60d52338 MainThread beam_integration_benchmark(1/1) INFO Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/pkb.log>
2018-03-07 18:03:47,313 60d52338 MainThread beam_integration_benchmark(1/1) INFO Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/60d52338/completion_statuses.json>
Build step 'Execute shell' marked build as failure