Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/18 06:45:50 UTC

Build failed in Jenkins: beam_PerformanceTests_HadoopFormat #194

See <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/194/display/redirect?page=changes>

Changes:

[millsd] Use StateTags.ID_EQUIVALENCE when using comparing StateTags

[amaliujia] [BEAM-7461] Disable flaky tests due to flakiness

[github] Revert "[BEAM-7513] Adding Row Count for Bigquery Table"

[github] [BEAM-7467] Add dependency classifier to published pom (#8868)

[github] Fixing file naming for windows (#8870)

------------------------------------------
[...truncated 539.93 KB...]
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:50.847Z: Fusing consumer PAssert$0/GroupGlobally/RewindowActuals/Window.Assign into PAssert$0/GroupGlobally/GatherAllOutputs/Values/Values/Map
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:50.893Z: Fusing consumer PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/GroupByWindow into PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Read
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:50.942Z: Fusing consumer PAssert$0/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign into PAssert$0/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:50.988Z: Fusing consumer PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Reify into PAssert$0/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.036Z: Unzipping flatten s13-u44-u49 for input s15.org.apache.beam.sdk.values.PCollection.<init>:402#b96b6624e7817fb3-c47
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.076Z: Fusing unzipped copy of PAssert$0/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map, through flatten Calculate hashcode/Flatten.PCollections/Unzipped-2, into producer PAssert$0/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.113Z: Fusing consumer Calculate hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.160Z: Fusing consumer PAssert$0/GroupGlobally/Window.Into()/Window.Assign into Calculate hashcode/Values/Values/Map
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.180Z: Fusing consumer Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) into Calculate hashcode/Values/Values/Map
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.218Z: Fusing consumer PAssert$0/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map into PAssert$0/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.265Z: Fusing consumer PAssert$0/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous) into PAssert$0/GroupGlobally/Window.Into()/Window.Assign
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.305Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.344Z: Fusing consumer Get values only/Values/Map into Collect read time
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.391Z: Fusing consumer Values as string into Get values only/Values/Map
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.430Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.465Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.497Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.541Z: Fusing consumer Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate hashcode/WithKeys/AddKeys/Map
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.579Z: Fusing consumer Calculate hashcode/Values/Values/Map into Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.625Z: Fusing consumer Calculate hashcode/WithKeys/AddKeys/Map into Values as string
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.663Z: Fusing consumer Collect read time into Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.709Z: Fusing consumer PAssert$0/GroupGlobally/GroupDummyAndContents/Reify into PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:51.744Z: Fusing consumer PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign into PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:52.247Z: Executing operation PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Create
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:52.292Z: Executing operation PAssert$0/GroupGlobally/GroupDummyAndContents/Create
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:52.330Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:52.349Z: Starting 1 workers in us-central1-a...
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:52.409Z: Finished operation PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Create
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:52.434Z: Finished operation PAssert$0/GroupGlobally/GroupDummyAndContents/Create
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:52.434Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Create
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:52.653Z: Executing operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Jun 18, 2019 6:42:53 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:42:52.699Z: Executing operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign+PAssert$0/GroupGlobally/GroupDummyAndContents/Reify+PAssert$0/GroupGlobally/GroupDummyAndContents/Write
    Jun 18, 2019 6:43:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:43:38.126Z: Workers have started successfully.
    Jun 18, 2019 6:44:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2019-06-18T06:44:06.609Z: java.lang.RuntimeException: java.lang.RuntimeException: org.postgresql.util.PSQLException: The connection attempt failed.
    	at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.setConf(DBInputFormat.java:171)
    	at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$HadoopInputFormatBoundedSource.createInputFormatInstance(HadoopFormatIO.java:721)
    	at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$HadoopInputFormatBoundedSource.computeSplitsIfNecessary(HadoopFormatIO.java:677)
    	at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$HadoopInputFormatBoundedSource.split(HadoopFormatIO.java:639)
    	at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:284)
    	at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:206)
    	at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:190)
    	at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:169)
    	at org.apache.beam.runners.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:78)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.RuntimeException: org.postgresql.util.PSQLException: The connection attempt failed.
    	at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.createConnection(DBInputFormat.java:205)
    	at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.setConf(DBInputFormat.java:164)
    	... 18 more
    Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
    	at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:257)
    	at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
    	at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:195)
    	at org.postgresql.Driver.makeConnection(Driver.java:452)
    	at org.postgresql.Driver.connect(Driver.java:254)
    	at java.sql.DriverManager.getConnection(DriverManager.java:664)
    	at java.sql.DriverManager.getConnection(DriverManager.java:247)
    	at org.apache.hadoop.mapreduce.lib.db.DBConfiguration.getConnection(DBConfiguration.java:154)
    	at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.createConnection(DBInputFormat.java:198)
    	... 19 more
    Caused by: java.net.SocketTimeoutException: connect timed out
    	at java.net.PlainSocketImpl.socketConnect(Native Method)
    	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    	at java.net.Socket.connect(Socket.java:589)
    	at org.postgresql.core.PGStream.<init>(PGStream.java:69)
    	at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:156)
    	... 27 more

    Jun 18, 2019 6:44:07 AM org.apache.beam.runners.dataflow.TestDataflowRunner$ErrorMonitorMessagesHandler process
    INFO: Dataflow job 2019-06-17_23_42_34-7282726993535903487 threw exception. Failure message was: java.lang.RuntimeException: java.lang.RuntimeException: org.postgresql.util.PSQLException: The connection attempt failed.
    	at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.setConf(DBInputFormat.java:171)
    	at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$HadoopInputFormatBoundedSource.createInputFormatInstance(HadoopFormatIO.java:721)
    	at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$HadoopInputFormatBoundedSource.computeSplitsIfNecessary(HadoopFormatIO.java:677)
    	at org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO$HadoopInputFormatBoundedSource.split(HadoopFormatIO.java:639)
    	at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:284)
    	at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:206)
    	at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:190)
    	at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:169)
    	at org.apache.beam.runners.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:78)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:411)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:380)
    	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:305)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
    	at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
    	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    	at java.lang.Thread.run(Thread.java:745)
    Caused by: java.lang.RuntimeException: org.postgresql.util.PSQLException: The connection attempt failed.
    	at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.createConnection(DBInputFormat.java:205)
    	at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.setConf(DBInputFormat.java:164)
    	... 18 more
    Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
    	at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:257)
    	at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
    	at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:195)
    	at org.postgresql.Driver.makeConnection(Driver.java:452)
    	at org.postgresql.Driver.connect(Driver.java:254)
    	at java.sql.DriverManager.getConnection(DriverManager.java:664)
    	at java.sql.DriverManager.getConnection(DriverManager.java:247)
    	at org.apache.hadoop.mapreduce.lib.db.DBConfiguration.getConnection(DBConfiguration.java:154)
    	at org.apache.hadoop.mapreduce.lib.db.DBInputFormat.createConnection(DBInputFormat.java:198)
    	... 19 more
    Caused by: java.net.SocketTimeoutException: connect timed out
    	at java.net.PlainSocketImpl.socketConnect(Native Method)
    	at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    	at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    	at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    	at java.net.Socket.connect(Socket.java:589)
    	at org.postgresql.core.PGStream.<init>(PGStream.java:69)
    	at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:156)
    	... 27 more
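
    (For context: the failure above occurs while HadoopFormatIO computes splits, because DBInputFormat.setConf() must first open a JDBC connection to the PostgreSQL service; when the endpoint is unreachable, that surfaces as the PSQLException wrapping a SocketTimeoutException shown in the trace. The sketch below is a minimal, hypothetical configuration of that read path, not the test's actual code; the JDBC endpoint, credentials, table, column and the Row value type are all placeholders.)

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Writable;
    import org.apache.hadoop.mapreduce.InputFormat;
    import org.apache.hadoop.mapreduce.lib.db.DBConfiguration;
    import org.apache.hadoop.mapreduce.lib.db.DBInputFormat;
    import org.apache.hadoop.mapreduce.lib.db.DBWritable;

    public class PostgresHadoopFormatReadSketch {

      /** Placeholder value type; DBInputFormat needs a concrete Writable + DBWritable class. */
      public static class Row implements Writable, DBWritable {
        long id;
        public void write(java.io.DataOutput out) throws java.io.IOException { out.writeLong(id); }
        public void readFields(java.io.DataInput in) throws java.io.IOException { id = in.readLong(); }
        public void write(java.sql.PreparedStatement st) throws java.sql.SQLException { st.setLong(1, id); }
        public void readFields(java.sql.ResultSet rs) throws java.sql.SQLException { id = rs.getLong(1); }
      }

      public static void main(String[] args) {
        Configuration conf = new Configuration();
        // DBInputFormat.setConf() -> createConnection() opens a JDBC connection while the
        // source is being split; an unreachable endpoint shows up there as
        // PSQLException "The connection attempt failed" caused by a SocketTimeoutException.
        DBConfiguration.configureDB(
            conf,
            "org.postgresql.Driver",
            "jdbc:postgresql://postgres-host:5432/postgres", // placeholder endpoint
            "postgres",                                      // placeholder user
            "changeme");                                     // placeholder password
        conf.set(DBConfiguration.INPUT_TABLE_NAME_PROPERTY, "beam_test"); // placeholder table
        conf.set(DBConfiguration.INPUT_FIELD_NAMES_PROPERTY, "id");       // placeholder column
        // HadoopFormatIO expects the input format, key and value classes in the Configuration.
        conf.setClass("mapreduce.job.inputformat.class", DBInputFormat.class, InputFormat.class);
        conf.setClass("key.class", LongWritable.class, Object.class);
        conf.setClass("value.class", Row.class, Object.class);

        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply("Read using Hadoop InputFormat",
            HadoopFormatIO.<LongWritable, Row>read().withConfiguration(conf));
        p.run().waitUntilFinish();
      }
    }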

    Jun 18, 2019 6:44:48 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:44:46.435Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
    Jun 18, 2019 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:02.051Z: Finished operation Read using Hadoop InputFormat/Read(HadoopInputFormatBoundedSource)+Collect read time+Get values only/Values/Map+Values as string+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write
    Jun 18, 2019 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:02.169Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Jun 18, 2019 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:02.206Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Close
    Jun 18, 2019 6:45:03 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:02.324Z: Executing operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Window.Into()/Window.Assign+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign+PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Reify+PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Write
    Jun 18, 2019 6:45:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:04.297Z: Finished operation PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign+PAssert$0/GroupGlobally/GroupDummyAndContents/Reify+PAssert$0/GroupGlobally/GroupDummyAndContents/Write
    Jun 18, 2019 6:45:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:10.369Z: Finished operation Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Read+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract+Calculate hashcode/Values/Values/Map+PAssert$0/GroupGlobally/Window.Into()/Window.Assign+Calculate hashcode/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+PAssert$0/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign+PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Reify+PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Write
    Jun 18, 2019 6:45:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:10.384Z: Workers have started successfully.
    Jun 18, 2019 6:45:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:10.578Z: Executing operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Jun 18, 2019 6:45:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:10.636Z: Finished operation Calculate hashcode/View.AsIterable/CreateDataflowView
    Jun 18, 2019 6:45:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:10.836Z: Executing operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Window.Into()/Window.Assign+PAssert$0/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign+PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Reify+PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Write
    Jun 18, 2019 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:13.491Z: Finished operation Calculate hashcode/CreateVoid/Read(CreateSource)+Calculate hashcode/ProduceDefault+PAssert$0/GroupGlobally/Window.Into()/Window.Assign+PAssert$0/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)+PAssert$0/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map+PAssert$0/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign+PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Reify+PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Write
    Jun 18, 2019 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:13.595Z: Executing operation PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Close
    Jun 18, 2019 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:13.643Z: Finished operation PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Close
    Jun 18, 2019 6:45:15 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:13.745Z: Executing operation PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Read+PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/GatherAllOutputs/Values/Values/Map+PAssert$0/GroupGlobally/RewindowActuals/Window.Assign+PAssert$0/GroupGlobally/KeyForDummy/AddKeys/Map+PAssert$0/GroupGlobally/GroupDummyAndContents/Reify+PAssert$0/GroupGlobally/GroupDummyAndContents/Write
    Jun 18, 2019 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:18.969Z: Finished operation PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Read+PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/GroupByWindow+PAssert$0/GroupGlobally/GatherAllOutputs/Values/Values/Map+PAssert$0/GroupGlobally/RewindowActuals/Window.Assign+PAssert$0/GroupGlobally/KeyForDummy/AddKeys/Map+PAssert$0/GroupGlobally/GroupDummyAndContents/Reify+PAssert$0/GroupGlobally/GroupDummyAndContents/Write
    Jun 18, 2019 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:19.073Z: Executing operation PAssert$0/GroupGlobally/GroupDummyAndContents/Close
    Jun 18, 2019 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:19.121Z: Finished operation PAssert$0/GroupGlobally/GroupDummyAndContents/Close
    Jun 18, 2019 6:45:19 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:19.202Z: Executing operation PAssert$0/GroupGlobally/GroupDummyAndContents/Read+PAssert$0/GroupGlobally/GroupDummyAndContents/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Jun 18, 2019 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:23.721Z: Finished operation PAssert$0/GroupGlobally/GroupDummyAndContents/Read+PAssert$0/GroupGlobally/GroupDummyAndContents/GroupByWindow+PAssert$0/GroupGlobally/Values/Values/Map+PAssert$0/GroupGlobally/ParDo(Concat)+PAssert$0/GetPane/Map+PAssert$0/RunChecks+PAssert$0/VerifyAssertions/ParDo(DefaultConclude)
    Jun 18, 2019 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:23.976Z: Cleaning up.
    Jun 18, 2019 6:45:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2019-06-18T06:45:24.063Z: Stopping worker pool...

STDERR: Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

2019-06-18 06:45:46,797 20523ef8 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-06-18 06:45:46,799 20523ef8 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2019-06-18 06:45:46,800 20523ef8 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/config-beam-performancetests-hadoopformat-194> delete -f <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml> --ignore-not-found
2019-06-18 06:45:47,528 20523ef8 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-06-18 06:45:47,529 20523ef8 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-06-18 06:45:47,530 20523ef8 MainThread beam_integration_benchmark(1/1) INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-06-18 06:45:47,530 20523ef8 MainThread beam_integration_benchmark(1/1) INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/runs/20523ef8/pkb.log>
2019-06-18 06:45:47,530 20523ef8 MainThread beam_integration_benchmark(1/1) INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/runs/20523ef8/completion_statuses.json>
Build step 'Execute shell' marked build as failure
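
(The root cause is a socket connect timeout against the PostgreSQL service that the test infrastructure deploys via kubectl, so the Dataflow workers never obtain a usable JDBC connection. A standalone probe such as the hypothetical one below, run from the same network as the workers, is a quick way to tell an unreachable endpoint apart from an authentication problem; the host, port and credentials are placeholders.)

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class PostgresConnectivityProbe {
      public static void main(String[] args) {
        // Keep the login timeout short so an unreachable host fails fast instead of hanging.
        DriverManager.setLoginTimeout(10);
        String url = "jdbc:postgresql://postgres-host:5432/postgres"; // placeholder endpoint
        try (Connection c = DriverManager.getConnection(url, "postgres", "changeme")) {
          System.out.println("Connected: " + c.getMetaData().getDatabaseProductVersion());
        } catch (SQLException e) {
          // A SocketTimeoutException cause here matches the failure in the build log above:
          // the endpoint is unreachable, rather than rejecting the credentials.
          System.err.println("Connection failed: " + e);
        }
      }
    }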

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PerformanceTests_HadoopFormat #195

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/195/display/redirect>

